Something that we often cover when it comes to organic search (SEO) is the importance of using Schema markup and other elements of structured data to ensure that your content and the components of your website can be read by the search engines, giving you the best opportunity to rank.
Adding markup helps your website to be found for the right search terms at the right time, and is one of the major factors behind increased relevancy and conversion rates. In recent years the industry has been somewhat divided, with something of a battle for the same turf when it comes to the information you add into the HTML of a website.
Back in 2015 Google announced the launch of their structured data testing tool and finally brought it to market in 2017. Last year, however, they moved across to their Rich Results Test tool and away from structured data testing, causing a huge backlash from those in the community.
In a rare move, Google bowed to pressure and ceded responsibility for the structured data tooling directly to Schema.org, which is now the common home for the industry. By migrating the structured data testing tool off of their own domain and on to schema.org, Google hope for better support for the industry and said: “Google is committed to offering better support, open standards and improving development experience.”
So, what does this news mean and what should people in the industry be looking for and using when it comes to SEO? Below are some of the definitions of and concepts behind structured data, schema markup and rich results and an explanation of the importance from an organic search perspective.
What is schema markup?
Schema markup is essentially the code that is derived from microdata on schema.org. This site seeks to unify the language used by webmasters to provide metadata on pages which can be easily read by search engine spiders and parsers. Schema markup is how we refer to the microdata code that provides this metadata.
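As an illustration (the page details here are entirely hypothetical), a minimal piece of schema.org markup for an article, expressed as a JSON-LD script block, might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "An Example Headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-06-01"
}
</script>
```

This block sits in the page's HTML and is invisible to visitors, but gives spiders and parsers an unambiguous summary of what the page contains.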
What is structured data?
Rather than being a specific subset or variety of data, when we refer to ‘structured data’ we are talking about an organisational construct of data. While prose may convey information, it tends to do so in an organic way – conveying information in a looser, more conversational manner.
However, if you were to study the prose and distil its meaning into, for example, a table, this would represent ‘structured’ data: essentially the same information in an easier-to-digest format.
Where search engines are concerned, it is easier for an algorithm to parse information if it is offered within a scaffold or framework of structural information. This tends to be done using HTML, microdata and JSON-LD cues that provide the search engine with additional pointers that it can use to determine the nature of the data it is processing.
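As a sketch of the difference between these cues (the product name and price are hypothetical), the same fact can be expressed inline as microdata woven into the visible HTML, or as a standalone JSON-LD block:

```html
<!-- Microdata: attributes annotate the visible markup itself -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price" content="19.99">£19.99</span>
  </div>
</div>

<!-- JSON-LD: the same information in a separate script block -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": { "@type": "Offer", "price": "19.99", "priceCurrency": "GBP" }
}
</script>
```

Both forms convey identical meaning to a parser; JSON-LD is often preferred in practice because it keeps the structured data separate from the presentational HTML.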
A lot of schema markup is particularly useful for inclusion in what Google refer to as ‘rich results’ – the local packs, content cards and other SERP features that pull directly from a URL’s page content rather than from its defined meta title and meta description tags.
While this is not the only reason to implement structured data, the improved chance of featuring in these eye-catching SERP features is undoubtedly a major one. Structured data also benefits the end user through increased relevance in SERPs, and can benefit your brand by reducing the amount of irrelevant traffic.
What are rich results?
‘Rich results’ is the collective term (at least in Google parlance) for a number of various SERP features that aim to improve the user experience (UX) of results pages. These include (but are not limited to):
- Image packs
- Site links
- Featured snippets
- Shopping results
- Top stories
- People also ask
- Knowledge panels
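As one hypothetical example of markup commonly associated with question-and-answer style rich results, an FAQPage block in JSON-LD (the question and answer text here are placeholders) might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is schema markup?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Schema markup is machine-readable metadata added to a web page."
    }
  }]
}
</script>
```

Note that valid markup makes a page eligible for such features; the search engine still decides whether to display them.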
What is the Schema.org markup validator?
The validator is a tool that allows users to check both a full URL and a snippet of code for semantic markup – the latter being useful for those wanting to look at individual items such as products.
The interface itself is simple, offering both options from a single page.
By allowing the user to search in both ways, there is greater clarity on the information the search engines look for, and on the gaps that those optimising a website may have missed. At a basic level, a URL search for a site such as Nike’s shows the type of content – i.e. the web page – the id (including, in this case, the tag that determines the country: gb), and both the brand name and the full URL.
If you were to search purely on a code snippet for something like language or location, the results show a different view.
In Disney’s case, for example, you can see that one main site operates, and that search engines can determine a number of subsites and languages – in this case over 100.
But that’s not all: the new tool also surfaces gaps in your markup.
All of this metadata should be filled in if you want to perfect a site’s SEO, and it is especially important if you are a brand that operates from many locations. Adding in name, address and postcode, as well as phone numbers, contacts, logo and legal details, allows you to be found for the most relevant searches and will typically lead to an upturn across most goals, from website visits to conversion rate.
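Those location details map directly onto schema.org’s LocalBusiness type. As a hedged sketch (every name, number and address here is hypothetical), a JSON-LD block covering them might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "telephone": "+44 20 7946 0000",
  "logo": "https://www.example.com/logo.png",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  }
}
</script>
```

A multi-location brand would typically publish one such block per branch, each on that branch’s own landing page.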
What is Schema.org planning?
In a blog post on schema.org, Ryan Levering posted: “As agreed last year, Schema.org is the new home for the structured data validator previously known as the Structured Data Testing Tool (SDTT).
“It is now simpler to use, and available for testing. Schema.org will integrate feedback into its draft documentation and add it more explicitly to the Schema.org website for the next official release.
“SDTT is a tool from Google which began life as the Rich Snippets Testing Tool back in 2010. Last year Google announced plans to migrate from SDTT to successor tooling, the Rich Results Test, alongside plans to “deprecate the Structured Data Testing Tool”. The newer Google tooling is focused on helping publishers who are targeting specific schema.org-powered search features offered by Google, and for these purposes is a huge improvement as it contextualizes many warnings and errors to a specific target application.
“However, many publishers had also appreciated SDTT as a powerful and general purpose structured data validator. Headlines such as “Google Structured Data Testing Tool Going Away; SEOs Are Not Happy” captured something of the mood.”
“Amongst all this complexity, it is important to remind ourselves of the importance of simplicity and usability of Schema.org markup for its founding purpose: machine-readable summaries of ordinary web page content. Markup that – when well-formed – helps real people find jobs, educational opportunities, images they can re-use, learn from fact checkers or find a recipe to cook for dinner.
“This is the focus of the new Schema Markup Validator (SMV). It is simpler than its predecessor SDTT because it is dedicated to checking that you’re using JSON-LD, RDFa and Microdata in widely understood ways, and to warning you if you are using Schema.org types and properties in unusual combinations. It does not try to check your content against the information needs of specific services, tools or products (a topic deserving its own blog post). But it will help you understand whether or not your data expresses what you hope it expresses, and to reflect the essence of your structured data back in an intuitive way that reflects its underlying meaning.”
Why does it all matter?
Well, the truth is that the information we plant into the backend of a website – the technical SEO that takes place behind the scenes – is exactly what is needed to help your site rank for the right searches and higher up the SERPs.
As previously mentioned, while it may not be the way of the far future, structured data and schema markup is certainly the way of the present and the near future to ensure that you are allowing search engines and various digital assistants to correctly apportion relevance to your content.
As the ability of AI and machine learning algorithms grows, it may be that we can abandon the practice of planting such microdata signposts for search engines to follow, but for the time being there are both short and long term gains to be had from offering this helping hand to the various bots, spiders and crawlers that wander daily over our sites.
Whether it’s improved relevance in search or better visibility – both of which can have a tremendous impact on your bottom line – there are plenty of reasons to ensure that you’re employing structured data and schema markup. Few are more convincing than this: in an industry where things change, and change quickly, markup is one of the few things we can reasonably infer, from the actions of the big search engines, will help to future-proof your search marketing.