
101 – Understanding the Indexing API

When it comes to search and digital marketing, one of the most important things you have to consider is where your information ranks on search engine results pages (SERPs). You can have all of the content in the world, yet if it isn't indexed properly and the Googlebot can't crawl it, it is essentially useless.


One of the tools that businesses need to be aware of, and to use, is the Indexing API. This 101 gives you an idea of what it is and how to use it, as well as wider reading on the importance of using APIs to build bridges between different platforms and tools.

What is an API?

An Application Programming Interface, or API, is a set of functions and procedures that allows the creation of applications which access the features or data of an operating system, application, or other service.

An API is essentially a means of allowing different applications to communicate with one another. In the business sphere in which we operate, ‘API’ is something of a buzzword, and it is only going to become more common as businesses build towards a more integrated way of working.

Automation is vital, and an API can make it easier to track certain metrics and tailor your future strategies based on the data those APIs return. Another key benefit is the ability to present data in a number of different ways for a wide selection of people: comprehensive for those who need it to be, and succinct for those who only require a top-level view.

Benefits of APIs

There are many benefits of using APIs to further your performance. Here are a few of them:

  • They’re cheap – many APIs are free at the point of use
  • They’re fast – the tools have already been created, therefore you can get going with them straight away
  • They present data in exactly the way you want
  • They build bridges between apps and other platforms

Difficulties of APIs

Whilst there are many benefits of using APIs, there can be some difficulties. Speaking at the Benchmark Search and Digital Conference, Victoria Olsina from Barclays said: “I started reading a lot of documentation, a lot of resources, and I found this phrase ‘you can use the API’ – true, you can use the API, but only if you know how to code. Unless you know how to code, this can be regarded as fake news.

“I used the Google Page Insight tool and wanted to look at mobile Page Speed, mobile visibility and desktop Page Speed, but the problem was that nobody understood the data. I had to make it look nice to get buy in from those at my organisation but couldn’t combine it all, that’s where the API came in.

“When it works it works, but it can be very difficult, that is why it’s such a technical aspect of search.”

Some of the other considerations that you have to take on-board have been highlighted by research at ProgrammableWeb. They say: “While using APIs is a great idea, there are some things to be cautious about. Most importantly, you should always be concerned with how reliable the service you’re piggy-backing off of is. Ask any developer who’s been relying on Twitter APIs over the last few months and they’ll share many horror stories of service outages and various other problems.”

It is also worth noting that although these services are often free, which is a huge positive, this is also a reason many projects fold. It is not the service’s responsibility to keep your app operational. A service can shut down or limit how you use its API at any time, so you must make sure that you are not too reliant on it, or that you could build your own replacement.

What is the Indexing API?

When it comes to SEO and getting your site to the top of SERPs, you not only need to think about the content you have on your site and how authoritative and relevant it is, but also where it sits on your site. Google uses regular crawls to make sure it has the most up-to-date picture of your site, and this is where the Indexing API comes in. The Indexing API allows any site owner to directly notify Google when pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to higher quality user traffic.

In its current guise, the Indexing API is used for pages that are either live or very frequently updated. Google developers state that, currently, the Indexing API can only be used to crawl pages with either job posting or livestream structured data. For websites with many short-lived pages, such as job postings or livestream videos, the Indexing API keeps content fresh in search results because it allows updates to be pushed individually.

If we look at one of these examples, the benefits are clear to see. The livestream feature adds a live badge to video thumbnails in search results and is useful for generating traffic for short-lived video content. The live badge can be applied to any public video that is live streamed for any duration of time. Here are a few examples:

  • Sporting events
  • Awards shows
  • Influencer videos
  • Live streaming video games

Once your video is live and streaming, you will want to appear in all searches for keywords related to that event. If we take the example of ‘The Oscars’, many people will only find your livestream if it has been crawled. It is therefore vital that you call on the Indexing API. Developers at Google recommend the following strategy:

Use the Indexing API to ensure that Google crawls your livestream quickly. Call the API for the following events:

  • When the video goes live
  • When the video has stopped streaming, and the page’s markup has been updated to indicate the endDate
  • Whenever a change has happened in the markup and Google needs to be notified
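Each of those events maps to a single notification to the Indexing API's publish endpoint. The sketch below, in Python, shows the shape of that call; `build_notification` is an illustrative helper of our own, but the endpoint URL and the `url`/`type` request body are as documented by Google. Sending the request (with authentication) is left as a comment.

```python
import json

# The documented publish endpoint for Indexing API notifications.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, event: str) -> dict:
    """Build the JSON body for one Indexing API notification.

    Use "URL_UPDATED" when the stream goes live or its markup changes
    (e.g. the endDate is added), and "URL_DELETED" once the page is removed.
    """
    if event not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError(f"unsupported notification type: {event}")
    return {"url": url, "type": event}

# When the stream starts, and again when the markup's endDate is set
# (the page URL here is a made-up example):
body = build_notification("https://example.com/oscars-livestream", "URL_UPDATED")
payload = json.dumps(body)
# An authenticated POST of `payload` to ENDPOINT would follow here,
# e.g. using the `requests` library with an OAuth 2.0 bearer token.
```

The same helper covers all three bullet points above: the first two events send `URL_UPDATED`, and only removing the page entirely warrants `URL_DELETED`.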

What can you do with the Indexing API?

Some of the main features of the Indexing API are as follows:

  • Update a URL: Notify Google of a new URL to crawl or that content at a previously-submitted URL has been updated.
  • Remove a URL: After you delete a page from your servers, notify Google so that they can remove the page from their index and so that they don’t attempt to crawl the URL again.
  • Get the status of a request: Check the last time Google received each kind of notification for a given URL.
  • Send batch indexing requests: Reduce the number of HTTP connections your client has to make by combining up to 100 calls into a single HTTP request.
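The batching limit in that last point is worth planning for. As a minimal sketch, assuming you hold a long list of URLs to notify, a helper like the illustrative `chunk_requests` below splits them into groups that respect the documented 100-call-per-request ceiling; each group would then become one multipart batch request.

```python
def chunk_requests(urls: list, batch_size: int = 100) -> list:
    """Split URL notifications into batches of at most `batch_size`,
    the per-request limit documented for the Indexing API."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]

# 250 job-posting URLs (made-up examples) become three HTTP requests
# instead of 250 individual ones.
batches = chunk_requests([f"https://example.com/jobs/{n}" for n in range(250)])
```

Fewer HTTP connections means less overhead on your side and a tidier footprint against your daily quota.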

Setting up/syncing the API

As I’m sure you are aware, the Indexing API isn’t a necessity, as Google will still crawl the sitemap of your website and rank you accordingly. The difference, however, is one of speed. Google itself recommends using the Indexing API instead of sitemaps because the Indexing API prompts Googlebot to crawl your pages sooner than updating the sitemap and pinging a crawl request would.

To use the Indexing API, you need to:

  • Complete the prerequisites by enabling the Indexing API, creating a new service account, verifying ownership in Search Console, and getting an access token to authenticate your API call.
  • Send requests to notify Google of new, updated, or deleted web pages.
  • One thing to consider is that you may need more quota than the default to allow crawls.
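Once you have a service account and an access token, every request you send carries that token as a bearer header. The sketch below shows the header assembly only; `auth_headers` is an illustrative helper, and the token itself would come from your service account credentials (for example via Google's `google-auth` Python library, using the `https://www.googleapis.com/auth/indexing` scope).

```python
def auth_headers(access_token: str) -> dict:
    """Build the headers for an authenticated Indexing API call.

    The access token is the OAuth 2.0 credential obtained from your
    service account; it is assumed to have been fetched already.
    """
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
```

With those headers attached, the notification bodies described earlier can be POSTed directly to the API.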


To request more quota, follow the steps below:

  1. Go to the Google API Console.
  2. Select Quotas. A Requests quota limit window displays.
  3. Click Edit.
  4. Click Apply for higher quota.
  5. Enter the required fields.

Click Consult’s blog is a resource for thousands of search marketers every month; why not sign up to avoid missing out? Alternatively, check out our resource section for more helpful content or contact us today to see what we can do for your brand.


