5 things that will have the biggest SEO impact on your traffic

Nov 17th, 2022

In the modern world of SEO, the ways in which websites can increase their search traffic are almost endless, but there are some key areas that every site should get right first. This article will dive into the top 5 areas that are likely to have the biggest impact on your search traffic when properly optimised.

1. Keyword strategy

As any seasoned SEO will know, great keyword rankings drive more traffic to your site from search. However, you can’t just target any old keyword, rank number one and call it a day. The strategy you compile is very important to ensure that you’re ranking for the keywords that will give you the highest return.

What makes a good keyword?

A good keyword is typically defined as:

  • Not too generic: A generic keyword such as “tent” is too competitive to be a viable target for most sites.

  • Relevant: Only target keywords for things you actually offer, and map each keyword to a page that is relevant to it.

  • Has appropriate search volume: There is little advantage in targeting a low-volume keyword while higher-volume alternatives remain untapped, unless you have exhausted those options first.

  • Has the right search intent: If the results already ranking for a keyword are informational pages, targeting that keyword with a product page represents the wrong search intent. Always check the search results before you decide to target a keyword.

  • Covers questions that users are likely to ask: Keywords aren’t restricted to products or services. You should also target question-based queries that users are likely to ask.

Common keyword targeting mistakes

There are a few common targeting mistakes that you’ll want to avoid in your keyword strategy:

  • Targeting an unreasonably high-difficulty keyword: A high-difficulty keyword may be unattainable because well-known competitors with superior authority dominate its search results. This is common for broad short-tail keywords such as “tent” and can be avoided by targeting a long-tail variant instead. Keyword planning tools will usually tell you the difficulty of a keyword.

  • Targeting only one keyword to a page: A page will usually target multiple closely related keywords. If you restrict your strategy to a single keyword, you limit the searches the page can appear for.

  • Targeting too many keywords to a page: Too many keyword targets can dilute your content so much that none of them rank well. Don’t over-extend the page’s purpose; instead, consider creating a new page for additional targets.

Putting this strategy together requires keyword research, which can be carried out using a combination of tools such as Google’s Keyword Planner, Ahrefs’ Keywords Explorer, Keywordtool.io and many more. There is no single clear-cut way of performing keyword research: there are a lot of methods available, and every SEO will do theirs a little differently.
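
To make these trade-offs concrete, here is a minimal sketch of how you might shortlist keyword candidates once you’ve exported volume and difficulty figures from a tool such as Keyword Planner or Ahrefs. The keywords, thresholds, and field names here are all hypothetical; adjust them to match your own tooling and your site’s authority.

```python
# Hypothetical keyword shortlisting: filter exported keyword data by
# search volume and difficulty, then rank by estimated opportunity.
# All example data and thresholds are illustrative only.

candidates = [
    {"keyword": "tent", "volume": 90500, "difficulty": 86},
    {"keyword": "4 person waterproof tent", "volume": 1300, "difficulty": 22},
    {"keyword": "best tent for festivals", "volume": 2400, "difficulty": 31},
    {"keyword": "tent pole repair kit", "volume": 480, "difficulty": 12},
]

MIN_VOLUME = 300      # skip keywords with too little demand
MAX_DIFFICULTY = 40   # skip keywords the site can't realistically win

shortlist = [
    kw for kw in candidates
    if kw["volume"] >= MIN_VOLUME and kw["difficulty"] <= MAX_DIFFICULTY
]

# More volume is better; higher difficulty is worse.
shortlist.sort(key=lambda kw: kw["volume"] / (kw["difficulty"] + 1), reverse=True)

for kw in shortlist:
    print(f"{kw['keyword']}: volume={kw['volume']}, difficulty={kw['difficulty']}")
```

Note how the broad short-tail “tent” is filtered out by the difficulty cap, leaving only the long-tail variants discussed above.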

2. Content

By far the most important part of search optimisation is the page content. Search engines, Google in particular, have gone to great lengths to ensure that the websites appearing in search provide the best possible content to searchers. They have achieved this through continued refinement of their algorithms to favour high-quality, relevant content. This started with an early algorithm named “Panda”, which has over time progressed into what we now know as the “Core Updates”. The sheer number of updates to these algorithms over the years only reinforces the importance of getting content right.

So why is content so important? Without it, search engines can’t determine the purpose or relevancy of a page, because there is insufficient context for them to work with. Without context, they can’t determine which keywords the page is targeting, and therefore can’t show the site in search results where it would be useful to a searcher.

There are some commonly overlooked factors that can help with content quality:

  • Focus on intent: Search intent is what your users are actually looking for. Are they looking to purchase a product, or just looking for information? Or were they simply looking for your website by brand or product name? Knowing this is important to understand how your content should be written so that the user can achieve their initial goal.

  • Can it be trusted? Searchers need to be able to trust your content for it to be considered high quality. Earn that trust by always being truthful, including common questions and answers, avoiding grammatical and spelling errors, sticking to facts rather than opinions, and linking to credible sources to back up your claims.

  • Was it written by an expert? Content should ideally be written by someone who is an expert in the topic. That expertise could come from relevant educational qualifications, experience in the field, or day-to-day work within your business. The content should also offer insight and depth of knowledge where necessary.

  • Is it easy to read? The readability of your content should suit the target audience and avoid unnecessary jargon. It should also be presented with appropriate headings, sub-headings, and paragraphs.

  • Can you justify your claims? Use citations to back up any claims that you make within your content, and link to appropriate sources where needed.

  • Did you use AI to write it? (Don’t!) Modern AI content is getting very good at mimicking human writing, but it still falls short. Algorithms can now detect such content and flag it as low quality, so it is important that you avoid it.

  • Check your word count: An appropriate word count can go a long way in content optimisation. Length alone doesn’t make your content strong, but it should be long enough to cover the topic in sufficient detail and build context, yet not so long that you begin to repeat yourself or let the quality suffer. (A quick automated check of length and readability is sketched after this list.)
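
As a rough, automatable sanity check on the last two points, you can measure word count and a standard readability score before publishing. This sketch assumes the third-party textstat package (pip install textstat) and a draft saved as article.txt; the thresholds are illustrative, not official guidance.

```python
# Rough pre-publish content checks: word count and Flesch reading ease.
# Requires the third-party "textstat" package: pip install textstat
import textstat

with open("article.txt", encoding="utf-8") as f:
    draft = f.read()

words = textstat.lexicon_count(draft)               # word count, punctuation excluded
reading_ease = textstat.flesch_reading_ease(draft)  # higher score = easier to read

print(f"Word count: {words}")
print(f"Flesch reading ease: {reading_ease:.1f}")

# Illustrative thresholds only; tune them to your audience and topic.
if words < 300:
    print("Warning: content may be too thin to establish sufficient context.")
if reading_ease < 50:
    print("Warning: content may be hard to read for a general audience.")
```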

3. Technical SEO

On the surface, a website is just a collection of appropriately linked pages with an aesthetically pleasing design, but underneath there’s a whole host of hidden technical elements, all of which need to be aligned for a website to function properly. Many of these technical elements are particularly important for search engines to properly index your site.

Some of the most important technical elements are easily overlooked by web developers who do not have SEO experience, so it’s important to be aware of them and of the implications they can have on your website’s traffic if they’re not correctly configured.

Top technical elements that you should focus on

The most important technical elements are those which prevent your pages from being properly indexed, for example (a script for checking several of these is sketched after the list):

  • Noindex robots meta tag or robots.txt disallow rules that can prevent pages from being indexed at all.
  • Incorrectly configured canonical tags can result in incorrect pages being indexed.
  • Improper redirects, such as a temporary 302 used where a permanent 301 is intended, can leave the wrong URL indexed.
  • Infinitely generated URLs, such as those created by query parameters, can result in pages which are not useful being indexed, taking focus away from your primary pages.
  • Poorly configured hreflang tags can result in the wrong alternative-language page being indexed, or none at all.
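
A basic version of several of these checks can be scripted. The sketch below assumes the third-party requests and beautifulsoup4 packages; it fetches a URL without following redirects and reports the status code, any noindex robots meta tag, the canonical target, and whether robots.txt allows crawling. Treat it as a starting point rather than a full audit.

```python
# Minimal indexability check for a single URL: status code, robots meta tag,
# canonical tag, and robots.txt rules.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def check_indexability(url: str) -> None:
    # Don't follow redirects, so a 301/302 is reported rather than hidden.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(f"Status code: {resp.status_code}")
    if resp.is_redirect or resp.is_permanent_redirect:
        print(f"Redirects to: {resp.headers.get('Location')}")
        return

    soup = BeautifulSoup(resp.text, "html.parser")

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        print("Blocked: page carries a noindex robots meta tag.")

    canonical = soup.find("link", rel="canonical")
    if canonical:
        print(f"Canonical URL: {canonical.get('href')}")

    # Check robots.txt rules with Python's built-in parser.
    parts = urlparse(url)
    robots = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    if not robots.can_fetch("Googlebot", url):
        print("Blocked: robots.txt disallows Googlebot for this URL.")

check_indexability("https://www.example.com/")  # hypothetical URL
```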

Other technical elements to consider

But the technical elements don’t stop there – there are plenty of other ways a page can be prevented from being indexed properly. For example:

  • A poorly designed page that inadvertently (or deliberately) hides content can result in the content being ignored by search engines, so it won’t count towards the strength of the page.
  • Using JavaScript to inject content into the page can prevent search engines from seeing the page as it was intended (a quick check for this is sketched after this list).
  • A page which contains a huge asset payload could cause a page speed issue. Page speed is now a part of Google’s search algorithms (more on this in the next section).
  • … and more.
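
One cheap way to catch JavaScript-injected content is to fetch the raw HTML, which is what a crawler sees before any rendering, and confirm that your key copy is present in it. A minimal sketch, assuming the requests package; the URL and phrases are hypothetical:

```python
# Check whether important content is present in the raw (unrendered) HTML.
# If it only appears after JavaScript runs, search engines may miss it or
# delay indexing it. Requires: pip install requests
import requests

URL = "https://www.example.com/product"  # hypothetical page
KEY_PHRASES = ["waterproof tent", "free delivery"]  # hypothetical key copy

raw_html = requests.get(URL, timeout=10).text.lower()

for phrase in KEY_PHRASES:
    if phrase.lower() in raw_html:
        print(f"OK: '{phrase}' is present in the raw HTML")
    else:
        print(f"Missing: '{phrase}' may only appear after JavaScript rendering")
```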

If a page can’t be indexed at all, then it’s most likely not going to be shown to searchers at all. If it can’t be indexed as it was intended, then its chances of being shown as a search result are reduced.

A page that isn’t shown in the search results will gain no traffic from this source, which in some cases can represent a significant proportion of the total traffic to the page. To prevent this from happening, you should carry out top-level technical SEO checks on the site regularly, and an in-depth audit once or twice a year, or after any major changes are made to the site.

4. Page speed

Google in particular has made it clear over the years that the faster a website performs, the happier its users will be. To this end, page speed is now becoming a factor in Google’s search algorithms, which means that you need to start paying more attention to how well your website performs.

There are plenty of tools available that can help you measure page speed, but the most useful are Google’s own PageSpeed Insights tool and the Core Web Vitals report built into Google Search Console. As these tools were built by Google, you can use them to ensure that you’re meeting Google’s recommended guidelines.

Page speed isn’t just about how fast your page loads, but also about the user experience during that time. Google breaks page speed down into key metrics called “Core Web Vitals”, which are designed to help you understand what is slowing the site down, from rendering speed to how long it takes for the page to become usable. Each page is given a “pass” or “fail” after being assessed against the relevant metrics.

Alongside Core Web Vitals, these tools also provide diagnostics and opportunity recommendations that help you to fix slow page speed and failing metrics.
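
You can also pull these numbers programmatically via the PageSpeed Insights API, which is handy for tracking pages over time. A minimal sketch, assuming the requests package; the exact response fields used here reflect my reading of the v5 API and are worth verifying against Google’s documentation.

```python
# Query the PageSpeed Insights v5 API for a Lighthouse performance score.
# An API key is optional for light, ad-hoc usage.
# Requires: pip install requests
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url: str, strategy: str = "mobile") -> float:
    resp = requests.get(
        PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60
    )
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports the performance category as a score from 0 to 1.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

score = performance_score("https://www.example.com/")  # hypothetical URL
print(f"Mobile performance score: {score * 100:.0f}/100")
```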

Some of the most common causes of slow page speed are:

  • Unused JavaScript and CSS: Any JS or CSS code which is loaded into the page but goes unused represents wasted download resources. Wherever possible, limit the amount of unused JS and CSS on a page by splitting your files so that only the code needed for each page is loaded.

  • Unoptimised images: All images on the site should be optimised as much as possible. Optimisation consists of reducing the file size without visibly harming image quality, using a resolution suited to the size of the container the image will be displayed in, and using an appropriate file type (a quick example is sketched after this list).

  • Third-party scripts: Any scripts loaded into the site will slow it down to some degree. Some are unavoidable (such as analytics), so our recommendation is to keep them to a minimum. Wherever possible, host the scripts on your own website so that connection overheads are reduced.

  • Slow server response time: An overloaded web server or poorly coded website can cause slow processing time. Your response time should always be as fast as possible; anything over 0.5 seconds will be very noticeable to users.
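
Two of these causes lend themselves to quick scripting. The sketch below resizes and recompresses an image with the Pillow package, then approximates server response time as the time until response headers arrive; the filenames, sizes, and thresholds are all illustrative.

```python
# 1) Optimise an image: cap its resolution and recompress it as WebP.
# Requires: pip install Pillow requests
from PIL import Image
import requests

img = Image.open("hero-original.jpg")      # hypothetical source image
img.thumbnail((1200, 1200))                # cap at the container's display size
img.save("hero.webp", "WEBP", quality=80)  # modern format, smaller payload

# 2) Approximate server response time (time until headers arrive).
resp = requests.get("https://www.example.com/", stream=True, timeout=10)
ttfb = resp.elapsed.total_seconds()
print(f"Approximate server response time: {ttfb:.2f}s")
if ttfb > 0.5:
    print("Warning: responses over 0.5 seconds will be noticeable to users.")
```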

5. Link building

Although there has been plenty of speculation over the years that links to a website are becoming less of a consideration in search algorithms, there is no evidence to support this theory, and links remain a large part of overall search optimisation.

How do you earn natural links?

  • Publish high-quality ‘evergreen’ content: The best kind of content is content that remains relevant for a long period of time, that is, content which is not time-sensitive. Content that lasts can continue to gain ranking traction, and attract links, over a long period.

  • Write high-quality guest blog posts: Guest blog posting is still relevant today, but you should only write for established, authoritative sites rather than low-quality ones.

  • Use social media to drive your content to its target audience: Your content is no use if people aren’t seeing it. Social activity won’t directly impact your rankings, but it can raise awareness of the article, resulting in more shares that drive traffic, and you might gain some natural links in the process.

  • Build something better than the competition: If your content is more valuable than what the competition has already written, then it is likely to gain attention and links.

  • Write regularly: Writing to a regular schedule establishes your website as a reliable and trustworthy source of information on the topics that you write about.

Some common link building mistakes

  • Publishers not using the correct “rel” attribute on links: Several rel attributes are used in HTML to define the type of link being placed, including “nofollow”, “sponsored” and “ugc”. These are for links you do not want link equity to pass from, because the linked site is not fully trustworthy, the link is a paid or sponsored placement, or it comes from user-generated content. Not using them correctly increases the risk of the link simply being devalued or, at worst, a manual action penalty being applied (although this generally only happens if you have a significant number of them). A quick audit script is sketched after this list.

  • Earning links from irrelevant sites: Naturally, you will want to earn links from sites which cover topics relevant to your own. A significant number of links from sites covering unrelated topics will look unnatural. It’s not possible to fully avoid such links, but being selective about the audience you’re outreaching to is important to keep them to a minimum.

  • Not reviewing the toxicity of your existing backlinks: Backlink quality doesn’t stay static from the day you gain a link, so twice-yearly reviews of your backlink profile can help to ensure you’re not building an overall toxic profile. Tools such as SEMrush’s Backlink Audit Tool can help you with this analysis, from which you can build an appropriate Google disavow file (a minimal example is sketched after this list). You generally shouldn’t disavow just individual poor-looking sites, as Google handles these quite well; instead, look to disavow low-quality themes of sites, such as affiliate, gambling, or other obviously malicious networks.

  • Ignoring the technical and on-page aspects of SEO: Sometimes link building isn’t the only answer for improving your site’s performance. If you neglect your on-page and technical SEO, you will only get so far with your link building efforts. Regularly run technical analysis on your site and ensure that your content is relevant and of high quality.
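
Two of these mistakes can be caught with short scripts. First, a sketch (assuming the requests and beautifulsoup4 packages, with a hypothetical URL) that lists a page’s external links together with their rel attributes, so paid or untrusted links missing nofollow, sponsored or ugc stand out; deciding which links actually need those attributes remains an editorial judgement.

```python
# List external links on a page together with their rel attributes.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGE = "https://www.example.com/blog/post"  # hypothetical page to audit
our_host = urlparse(PAGE).netloc

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    if host and host != our_host:  # external link
        rel = " ".join(a.get("rel", [])) or "(none)"
        print(f"{a['href']} -> rel=\"{rel}\"")
```

Second, the disavow file itself is just plain text: comment lines start with #, and each remaining line is either a full URL or a domain: entry. A minimal sketch that writes one from a manually reviewed list (the domains are placeholders):

```python
# Build a Google disavow file from a manually reviewed list of toxic domains.
toxic_domains = [
    "spammy-casino-links.example",   # placeholder domains: replace with the
    "cheap-affiliate-farm.example",  # output of your own backlink audit
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file generated after backlink audit\n")
    for domain in sorted(set(toxic_domains)):
        f.write(f"domain:{domain}\n")

print("Wrote disavow.txt; upload it via Google's Disavow Links tool.")
```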

Contact us today to find out more about how Click Consult’s ongoing technical consultation can help you to increase traffic to your site.
