Google’s latest ‘Broad Core Update’ continues a trend of increasing activity

Mar 14th, 2019

The latest broad core update to the Google algorithm was announced yesterday – with SERP trackers recording a few consecutive days of heightened activity – but what does it mean for now and in a broader sense?

Google is furious, the Mozcast is inclement and the Algoroo has been clearing small buildings in a single bound. In fact, as would be expected in a week where Google announced a broad core update, all similar tools are showing increased fluctuation in search engine results pages (SERPs).

[Image: SERP activity]

The update, which was announced via the Google Search Liaison Twitter account, has no specific required action or ‘fix’, but was announced to prevent overreaction from webmasters who might see a ranking drop as a trigger for making a lot of unnecessary and potentially damaging changes.

In fact, as an article published in Search Engine Land reminds us – this variety of update is intended to reward, or at least recognise, good content that was being overlooked, rather than to penalise any particular activity.

Nevertheless, a blog covering the announcement was rushed out yesterday on every SEO-related site. While few updates have been warned about in advance (the first Mobile Friendly update and the Speed Update being the major exceptions), there has been fairly regular communication from Google – though not as much as the community’s inferred updates might demand.

The reason for this – as posited by Danny Sullivan through his @SearchLiaison Twitter account – is that Google will tend only to announce updates where there is no action that webmasters can take, in an attempt to prevent them making unnecessary changes to their sites.

This, to the relief of the search industry, clearly implies that there are a lot of updates where there are things we can do – but also that these are usually not announced. The exceptions – as a tweet from later the same day states – are larger updates where actionable advice is beneficial.

The introduction of the Search Liaison position at Google – as well as its history of Googlers who are heavily involved in the SEO community (Matt Cutts, John Mueller, Gary Illyes etc.) – shows that Google is fairly committed to outreach, and for good reason. The search community is quick to notice changes, quick to ask questions – and often to make accusations – and as these changes come more rapidly over time, the community at large can get a little fraught.

The search and digital marketing industry has played a small but not insignificant part in the rise of Google, and there has been a historical assumption in some quarters that Google should return the favour and keep us in the loop as things progress – an assumption that has done little but cause resentment as, time after time, the ‘we make hundreds of changes a year’ message is rolled out.

The major issue with this is that the technology powering these updates – and the business and even political conditions driving them – is likely to be orders of magnitude more complicated than when Cutts joined the WebmasterWorld forums back in 2000 as GoogleGuy.

Not only has machine learning, and the consequent rapid iteration of updates, likely made some changes virtually unknowable – let alone communicable – to all but the most involved programmers on Google’s staff, but many changes are likely aimed at overcoming political, technical and other issues which would, fairly, represent commercially sensitive decisions.

While I’m no fan of many of Google’s decisions over the last decade, its outreach to the search industry has been reasonable and – while it really does need to just give us the data on voice – it provides the search community with a lot of information, which can be overlooked in the rush to identify and name the latest update.

[Charts: SERP flux over time. Sources: Algoroo, Accuranker]

As can be seen from the above charts, which show a definite increase in the frequency of activity over the last five years, we are approaching a point where flux is virtually the normal state of the SERPs – and this will only increase as machine learning takes on more of a role; a machine can iterate and assess far more quickly than us puny humans, after all.

Therefore, while it will always be beneficial to keep one eye on this fluctuation – to seek to identify trends and shape best practice – the unrelenting fascination with updates is counterproductive. The advice has remained consistent – to create the best possible content, and provide the best possible experience for users.

While the manner of execution may change over time, the advice will remain consistent and we would perhaps be better served by attempting to ensure that we are operating at our best and doing what we do well – identifying trends, building, writing and optimising for the web we want to see.

For the moment, at Click, like most agencies, we’ll be keeping an eye on the SERPs and the tools that track them, making inferences and seeking correlates while concentrating on improving content and onsite experience. There are activities we can undertake to build strategies that account for the expected actions of Google and the rising technologies likely to shake things up – but lamenting Google’s lack of transparency is energy that could be better deployed elsewhere.

Keep up to date with the latest in search news by subscribing to our blog – or contact us to find out how we can help your brand worry less about updates.
