Technical SEO – Why should you care?
Apr 26th, 2021

Technical SEO is the process of ensuring that your website can be crawled and indexed more efficiently by search engine spiders, with the goal of improving organic rankings. Important elements of Technical SEO include crawling, indexing, performance, and website architecture.
Within these Technical SEO pillars, there are a number of factors that can affect your organic search performance. To create great websites that are both SEO friendly and provide exceptional user experience, you need a solid Technical SEO foundation that complements quality and authoritative content.
Crawlability – Can search engines find your site?
A search engine like Google consists of a crawler, an index and an algorithm. The crawler, also known as a bot or a spider, follows links to find your website and the pages within it. Crawlability has to do with how easily Google can crawl your site.
Crawlability can be affected by various factors – rules in the robots.txt file, site architecture, internal linking structure, the XML sitemap, broken pages, server errors, duplicate content, and whether JS and CSS files are crawlable. These elements can affect your rankings as well as the site’s crawl budget, which is why it’s important to regularly check and improve the structure of your site and remove or update any outdated content.
Think of it this way: crawling costs Google a lot of money, and if its spiders keep finding broken pages on your site, they will decide to crawl it less frequently. That becomes a problem when fresh content is uploaded but it takes much longer for the crawlers to find it. Crawl depth is another element to consider when analysing your site’s structure. Are the most important pages of your site only a few clicks away from your Homepage? Do you have heavy blog categories with valuable and authoritative copy hidden 10 or 15 clicks away?
The good thing is that we can communicate with the crawler through our robots.txt file and help the spider understand which pages we would, and would not, like crawled. For example, a common practice is to exclude onsite search results via the robots.txt file to save crawl budget. You can also prevent certain types of files, such as PDFs, from being crawled and found by the bots. That said, it is important to regularly review your robots.txt file and also check your Google Search Console’s Coverage report to discover and resolve crawling issues.
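As a rough illustration, a robots.txt along these lines would keep internal search result pages and PDF files out of the crawl. The /search/ path and the PDF pattern are placeholders and would need to match your own site’s URL structure:

    # Illustrative robots.txt sketch – adapt the paths to your own site
    User-agent: *
    # Keep internal site-search result pages out of the crawl to save crawl budget
    Disallow: /search/
    # Stop PDF files from being crawled (Googlebot supports the * and $ wildcards)
    Disallow: /*.pdf$

    # Point crawlers at the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml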
Indexation – Is your site found in the index?
Crawling and indexing are not the same thing. Even if your pages have been crawled, they may not appear in the index. For example, if you have added a noindex tag to a particular section of your site, Google will not include those URLs in the index. However, Google may also decide not to include pages in the index because of duplicate content or redirect chains. Comparing the pages submitted in your XML sitemap with those found in Google’s index is a good place to start when reviewing whether a particular set of pages is missing from the index.
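For example, a page you are happy for Google to crawl but do not want indexed can carry a robots meta tag in its head section (non-HTML files such as PDFs can use the equivalent X-Robots-Tag HTTP response header instead):

    <!-- Keep this page out of the index while still letting crawlers follow its links -->
    <meta name="robots" content="noindex, follow">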
Performance
Page speed has been a ranking factor for a long time, but we see Google placing even more value on page experience, with the Core Web Vitals officially coming into play in May 2021 as part of the Page Experience algorithm.
The three metrics – LCP (Largest Contentful Paint), FID (First Input Delay) and CLS (Cumulative Layout Shift) – have made the headlines more than once in the past couple of months, and for good reason. Google considers these metrics important to the overall user experience, and webmasters should strive to make their sites as fast as possible. There are now different ways to audit your site to discover Core Web Vitals issues; this requires a more bespoke approach and there is no one-size-fits-all solution. Start by examining the Mobile and Desktop reports in Google Search Console’s Core Web Vitals section.
Whilst Google Search Console only shows a sample of pages within each category, you can still use this to carry out further analysis into what elements could be causing page speed issues. The Page Experience update scheduled to roll out later this year also takes into consideration site speed, safe browsing, intrusive interstitials, and mobile-friendliness. Using Lighthouse or PageSpeed Insights, you will be able to pinpoint specific issues for each page on your site – for example, whether the server response time is too long or whether key requests such as fonts have been preloaded.
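On the font point, a critical web font can be flagged for early fetching with a preload hint in the page head. This is only a sketch and the font path is a placeholder for your own assets:

    <!-- Ask the browser to fetch a critical font early; crossorigin is required for font preloads -->
    <link rel="preload" href="/fonts/brand-font.woff2" as="font" type="font/woff2" crossorigin>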
As part of a site’s performance, we also assess server-side and client-side JavaScript rendering: is the content on your site dependent on client-side JavaScript to render, or not? HTTP status codes, structured data, thin and duplicate content, and canonical tags are also key factors we review when assessing a site’s performance.
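On the canonical point, duplicate or parameterised URLs can point search engines to the preferred version of a page with a canonical tag in the head. The URL below is purely illustrative:

    <!-- Tell search engines which URL is the preferred version of this page -->
    <link rel="canonical" href="https://www.example.com/category/product-page/">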
Site Structure and Navigation
When we talk about site structure, we refer to how well the pages on a site are organised and how the relationship between categories and subcategories is communicated to the search engines. Site structure also plays a really important role in your internal linking and helps spread link authority through the inner pages of your site. With this in mind, you want your site structure to be well organised, ensuring that search engine bots and customers can easily find your most important pages.
This is where we look at crawl depth and orphan pages, as poorly designed site structures can result in new pages that are not linked to from anywhere on the site. So what would be the optimal structure for a website? The below graph, taken from Moz, perfectly demonstrates an SEO-friendly menu structure with a minimum number of links between the Homepage and the inner categories.
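As a rough text sketch of the same idea (not the Moz graphic itself, and with purely illustrative category names), a flat, pyramid-shaped structure keeps every important page within a few clicks of the Homepage:

    Homepage
    ├── Category A            (1 click from the Homepage)
    │   ├── Subcategory A1    (2 clicks)
    │   │   └── Product or article pages  (3 clicks)
    │   └── Subcategory A2
    ├── Category B
    └── Category C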
Why should you invest in Technical SEO?
We continue to see fantastic web designs across different industries. Sites with amazing functionality and impressive branding stop us in our tracks, but the reality is that even with the most inspiring sites, if Google can’t find them, neither will your customers.
This is where SEO comes in: it can help strengthen brand recognition, build trust and credibility with your existing audience, and open doors to potential customers. Technical SEO goes hand in hand with keyword research and unique content, and it is the backbone of strong organic performance.