Google’s Webmaster Guidelines: What the updates mean

Feb 12th, 2016

The latest updates to Google’s Webmaster Guidelines, which are designed to give site owners greater clarity about what Google is looking for when it crawls their sites, place even more emphasis on the importance of user experience.


Our Senior Organic Search Strategist Stuart Jones has analysed the updates and identified how they can help webmasters rank and perform better:

  • Every page should be accessible from at least one other page. The anchor text, whether it’s the text of a link or an image’s alt attribute, should be relevant to the landing page.
  • Provide a human-readable sitemap as well as an XML sitemap. The organic search benefit of an HTML sitemap has been debated in recent years, and the fact that Google emphasises its significance in the updated guidelines says a lot about the importance of user experience for SEO performance. This point also goes hand in hand with the first, since the HTML sitemap is usually linked from the footer of every page, making it accessible from anywhere on the site (see the sitemap sketch after this list).
  • The maximum acceptable number of combined internal and external links on any one page is ‘a few thousand at most’; the old guidelines stipulated only that this be kept to ‘a reasonable number’.
  • The crawling of internal ‘search results’ pages and of URLs containing session IDs or other unnecessary parameters should be prevented via the robots.txt file, since these pages add no value from the search engine’s perspective and do not need to be indexed (see the robots.txt sketch after this list).
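To illustrate the sitemap point, here is a minimal sketch of the XML sitemap format defined by the sitemaps.org protocol (the URLs are invented); the human-readable equivalent would simply be an HTML page of categorised links to the same URLs, linked from the site footer:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per canonical page; the URLs below are hypothetical -->
      <url>
        <loc>https://www.example.com/footwear/walking-boots/</loc>
        <lastmod>2016-02-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/footwear/walking-boots/care-guide/</loc>
      </url>
    </urlset>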
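And for the point on search results pages and parameterised URLs, a minimal robots.txt sketch might look like the following (the /search/ path and the ‘sessionid’ parameter name are hypothetical; substitute your own site’s paths and parameters):

    # Block crawling of internal search results pages
    # (assumes they live under /search/; adjust to your site)
    User-agent: *
    Disallow: /search/

    # Block URLs carrying session IDs or other unnecessary parameters
    # ('sessionid' is an example parameter name)
    Disallow: /*?sessionid=
    Disallow: /*&sessionid=

Bear in mind that robots.txt prevents crawling rather than indexing; URLs that are already indexed may also need a ‘noindex’ directive or consolidation via canonical tags.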

Google often makes minor updates to its Webmaster Guidelines, but these updates are somewhat more significant and should be considered more carefully by site owners, businesses and SEOs.

Stuart Jones, Senior Organic Search Strategist


  • Page titles should be descriptive, specific and accurate; the old guidelines omitted the word ‘specific’.
  • A clear, conceptual page hierarchy should be in place. The use of the word ‘conceptual’ here is particularly important: if a user cannot quickly grasp the concept behind a site’s hierarchy, then a crawler bot will likely struggle too. An eCommerce website that relies solely on filters for its navigation, for example, would be difficult for a user to understand conceptually.
  • Valid HTML5 should be used at all times (a minimal example page illustrating this and the two points above follows this list).
  • Allow the crawling of all site assets which affect the rendering of web pages, including CSS and JavaScript (JS) files (see the robots.txt sketch after this list). Google is aware of the major role JS plays on the contemporary web, and allowing Google to read and interpret a page’s JS will likely aid the search engine’s understanding of how a page is designed and displayed to users.
  • A page’s most important content should be displayed by default, rather than ‘hidden’ under ‘Read More’ expanding sections or tabs, since this is seen as less accessible to users. At Click Consult we’ve witnessed examples of this: when searching for the copy behind such expanding tabs using the ‘allintext:’ search operator, we’ve found that it is not indexed.
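For the point on crawlable assets, a minimal robots.txt sketch follows (the /assets/ path is invented; the key is simply not to disallow the directories holding your CSS and JS):

    User-agent: Googlebot
    # Keep CSS and JavaScript crawlable so Google can render
    # pages the way users see them
    Allow: /*.css$
    Allow: /*.js$

    # Avoid blanket rules like the one below, which would stop
    # Google from rendering pages correctly:
    # Disallow: /assets/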
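Several of the points above, from descriptive and specific titles to a conceptual hierarchy, valid HTML5 and relevant anchor text and alt attributes, can be seen together in one minimal page. This is only a sketch; the shop, URLs and product names are invented:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <!-- Descriptive, specific and accurate page title -->
      <title>Men's Leather Walking Boots | Example Outdoor Shop</title>
    </head>
    <body>
      <!-- A clear conceptual hierarchy, reflected in the navigation -->
      <nav>
        <a href="/footwear/">Footwear</a> &gt;
        <a href="/footwear/walking-boots/">Walking boots</a>
      </nav>
      <h1>Men's leather walking boots</h1>
      <!-- Anchor text relevant to the landing page -->
      <p><a href="/footwear/walking-boots/care-guide/">How to care for leather walking boots</a></p>
      <!-- Image alt attribute describing the linked page's content -->
      <a href="/footwear/walking-boots/brown-boot/">
        <img src="/images/brown-leather-boot.jpg" alt="Brown leather walking boot">
      </a>
    </body>
    </html>

A page like this gives users and crawlers the same signals: where the page sits in the site’s hierarchy, what it is about, and where its links lead.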

Contact us today to request a complimentary organic search (SEO) analysis of your website.

 
