Getting the wider business to understand that no two SEO campaigns are the same is a challenge when trying to secure buy-in for a project.
One example that sticks out in my mind is working with a global travel company that was comparing its website and infrastructure with that of BBC News.
“The BBC doesn’t do it that way,” was a common phrase thrown into the mix.
Having worked with a number of organizations, ranging in size from sole traders to IPO-stage SaaS companies, one thing is apparent.
While SEO best practices may be a consistent benchmark to aim for, no two successful SEO strategies are the same – even if two websites are in the same vertical.
When we look at “ranking factors” or things that move the needle in favor of our clients’ websites, there are a large number of factors that need to be taken into account, including:
All of these factors are dependent on the business and the business model.
And, unless you have inside information, it’s highly unlikely that you can answer the same points for your competitors as you can for yourself.
This is why there is a lot of debate within the SEO industry about pretty much anything to do with SEO.
A prime example is the ongoing debate around just how valuable inbound links are within the wider mix. In some verticals and for some websites, they’re a lot more important than in others.
This is because as SEO professionals, we’re all exposed to varying levels of challenge.
In the past, Searchmetrics produced a ranking factor study looking at “universal ranking factors,” as well as weighting them by industry to show the differences between verticals.
Comparing the Travel Industry and the Finance Industry from the study, the top factors that correlated with higher performance within organic search for travel websites were:
In comparison, the top correlating factors in the Finance Industry were:
So does this mean that internal linking structures aren’t important in the Finance Industry and shouldn’t be a key part of your SEO strategy?
While studies like this are great, they only cover quantifiable elements that can be easily measured across a large sample of websites.
They don’t take into account less quantifiable, more subjective factors, such as brand and offline presence.
These studies do, however, show the different approaches taken in each vertical, which are heavily influenced by business model and product type.
You’d expect a travel website to contain a large number of images and plenty of guides talking about destinations, written with persuasive copy.
From a financial website or bank, by contrast, you’d expect concise copy that’s straight to the point with the information the user needs.
Correlating factors aside, some elements of organic search (within Google) are universal and should really be seen as basics across all websites, with strategy then being overlaid on top. These are:
Site speed is important for both SEO and site usability. It’s not so much a problem today, given the increased spotlight Google has given it in recent years, making it a staple part of the SEO auditing process.
The majority of websites also now run behind CDNs that allow for things such as JS/CSS and HTML compression.
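As a quick, hypothetical spot-check (not something from the original article), you could inspect response headers to see whether compression and CDN caching are actually in place – the URLs and CDN header names below are illustrative assumptions:

```python
import requests  # assumes the requests library is available

# Hypothetical list of URLs to spot-check
urls = ["https://www.example.com/"]

for url in urls:
    resp = requests.get(url, timeout=10)
    # Content-Encoding shows whether the response was compressed (e.g. gzip, br)
    encoding = resp.headers.get("Content-Encoding", "none")
    # CDN cache headers vary by provider; these are just common examples
    cdn_cache = resp.headers.get("CF-Cache-Status") or resp.headers.get("X-Cache", "unknown")
    print(f"{url} -> compression: {encoding}, CDN cache: {cdn_cache}")
```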
Given the focus on site speed in general marketing articles, website owners who may not be as technically savvy are now more aware of its importance – and the same can also be said of HTTPS.
If you run a JavaScript website, it needs to either use dynamic rendering or server-side rendering.
It’s also important to consider the differences between the HTML response and the rendered response, and to have a non-JS fallback solution in place where possible, as a large gap between the two can affect how your content is crawled, indexed, and evaluated.
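As a rough, hypothetical way to check that gap (this isn’t from the original article), you could fetch the raw HTML and the rendered DOM for the same URL and compare them. The sketch below assumes the requests and Playwright libraries and uses a placeholder URL:

```python
import requests
from playwright.sync_api import sync_playwright  # assumes Playwright is installed

url = "https://www.example.com/"  # hypothetical page to check

# The HTML response - what a non-JS crawler sees on the first pass
raw_html = requests.get(url, timeout=10).text

# The rendered response - the DOM after JavaScript has executed
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"Raw HTML length: {len(raw_html):,} characters")
print(f"Rendered HTML length: {len(rendered_html):,} characters")
# A large difference suggests key content only exists after rendering,
# so a server-side or dynamic rendering fallback may be needed.
```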
Whenever I’m talking about comparing two or more websites, I refer back to a Moz Whiteboard Friday from 2016. It looks at how Google may (or may not) interpret and evaluate the value of your content in comparison to the wider web corpus.
Although the video and transcript don’t directly mention machine learning and Google’s artificial intelligence, Rand Fishkin did talk about how Google assesses the web corpus and takes learnings from it – using the example of granola bars.
This is where approaches and strategies between competing websites tend to overlap.
In the example, when Google is assessing the quality of a page talking about granola bars, it’s looking at the web corpus for information and typically, you’d find pages containing nutritional value tables and lists of ingredients and allergens.
From this, Google would see keywords such as [calories], [fats], and [sugars], and then differentiating keywords such as [organic] and [vegan].
Applying this to another vertical such as travel, a page targeting [italy tours] may also talk about [rome coliseum], [milan], [venice], or [pompeii].
If Google finds 100 websites with a page targeting [italy tours], and 87 of the 100 contain a group of related topics and keywords and 13 don’t, machine learning (and logic) will go with the more consistent corpus.
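To make that idea more concrete, here’s a small, hypothetical sketch (not from the article) that counts how many of a set of related terms each competing page covers – the term list and page snippets are placeholder assumptions:

```python
# Related topics you'd expect on a page targeting [italy tours]
related_terms = ["rome", "colosseum", "milan", "venice", "pompeii"]

# In practice this would be the extracted body copy of each competing page;
# shortened placeholder snippets are used here.
page_texts = [
    "Our Italy tours visit Rome, the Colosseum, Venice and Pompeii...",
    "Book cheap Italy tours today. Best prices guaranteed...",
]

for i, text in enumerate(page_texts, 1):
    lowered = text.lower()
    covered = [term for term in related_terms if term in lowered]
    print(f"Page {i}: covers {len(covered)}/{len(related_terms)} related topics -> {covered}")
```

In practice you’d pull the body copy from crawled competitor pages and use a much richer set of related terms, but the principle – measuring how consistently a topic cluster appears across the corpus – is the same.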
This is also where you break away from just looking at single pages in isolation, and start looking at:
Because Google is aware of what the web corpus is saying, you need to be in line with this and then develop a strategy to go one better – not just mirror what Google already sees as being rank-worthy.
It’s important that you use the right strategies for your business model and target user base – taking into consideration your competitive space.
While some technical elements of SEO are universal, it’s not possible to blindly imitate strategies between websites and expect the same results.
This is why it’s difficult to forecast the impact of specific SEO tasks.
Due to the sheer number of variables (internal and external), no two websites are “the same”.
Saying that a website is “similar” is not enough.