This article examines shaky SEO strategies that are promoted as important but are in fact outdated. More importantly, it will show you how to evaluate whether such ideas are true.
Building a better SEO B.S. Detector will show you how to be more discerning about what you are told to believe and how to detect SEO B.S. when confronted with it. I believe it will make you a better search marketer.
The Problem with Common Sense
The problem with good ideas is that they are reasonable and make a lot of sense. For thousands of years the idea that the world was flat was reasonable and made a lot of sense. That is an example of how relying on common sense and “good ideas” can lead to bad conclusions.
Correlation Does Not Imply Causation
For example, several years ago someone observed that the top ranking sites across thousands of keyword phrases all had something in common: The top ranked sites had a strong social media presence, particularly in the form of Facebook Likes. The conclusion was reached that Facebook Likes must be a ranking factor. It made sense, but it was not true.
There was an easy way to disprove that theory and that was by reviewing the latest information retrieval literature (patents and academic research). A review of that literature would have demonstrated that there was no research or patent on a successful system for using social media signals as a ranking factor.
The Problem with Observation
Another area to be aware of is the problem of observation. The perspective of the person viewing an event will change how they view that event.
Remember the old fable about the blind men who were trying to describe an elephant? One blind man grabbed the trunk and concluded that an elephant is like a snake. Another blind man ran his hands across the legs and declared that an elephant is like a tree, and so on.
The same thing happens in the SEO industry where people in different niches experience an algorithm change in different ways. What usually ends up happening is that the people with the loudest voices end up dictating what the rest of the industry believes happened and those with contrary experiences are ignored.
This happened with the very first Phantom Update, when the industry rushed to embrace the idea that Google was targeting sites with poor user experience. In doing so it ignored the insights in a HubPages blog post about the Phantom Update, which noted that the change did not affect their site uniformly: certain sections were affected more than others. Nobody asked why the so-called update would affect some pages more than others. It’s possible that the 2015 event that was called the Phantom Update was about more than user experience and too many ads.
@martinibuster very. core update == phantom ¯\_(ツ)_/¯
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) March 21, 2017
Consider the Perspective from Multiple Experiences
It turns out that those updates were not actually algorithm updates. They were simply Google adjusting various parts of their existing algorithm.
Nothing was added to the algorithm; it was simply an adjustment to the core ranking algorithm. An actual update has traditionally been when Google added something new to the algorithm, typically to target a certain kind of spam or to improve its ability to understand natural language.
Not All Updates Target Spam
This is very important. Every time a change happens at Google, the SEO industry responds by declaring that Google “targeted” a certain kind of spam.
Historically, and this is a fact, Google updates were just as likely to be an improvement in its ability to “target” high quality content as it was to “target” low quality content. But for some reason the SEO industry is obsessed with interpreting every update as targeting low quality.
Consider a scenario in which Google rolled out an algorithm that improved its ability to understand content itself and how to connect that content to the user intent behind each search query. In that scenario, low quality content that targeted keywords (but not a user’s intent) would naturally stop ranking, and higher quality content that matched the searcher’s intent would start ranking.
This kind of change would affect a wide range of websites. Sites that were ranking because of their links would stop ranking if their content was not identified as an accurate answer to a search query. The same could happen to other sites that contained too many ads.
The abundance of ads was not the factor; the inability to properly answer the query was what kept that site from ranking. Similarly, the abundance of anchor text in links was not what kept another site from ranking; that site, too, failed to properly answer the search query.
That’s the effect of an algorithm focusing on better understanding content, but not an algorithm that focuses on demoting poor quality sites or sites with excessive anchor text.
That kind of core update would not be targeting low quality sites, but the SEO industry would perceive it that way, and the SEO industry would be wrong. I believe the SEO industry has been wrong about Phantom, and this explanation is why: Not all algorithm changes target spam.
Look for Validation of a Theory
The hallmark of a shaky SEO strategy is the lack of links to patents or scientific research. Always check to see if a claim is backed up by a link to a patent or research, because those links show that the hypothesis is at least plausible.
Any time an SEO guru makes a claim about Google’s algorithm, look for a link to a patent. Screenshots of Google Analytics, photos of check stubs, and personal anecdotes do not carry as much weight as solid proof that something is possible.
I’m going to discuss some of the harebrained SEO strategies currently making the rounds to illustrate how this works.
1. LSI and SEO
LSI (latent semantic indexing) was invented to help search engines distinguish between the multiple meanings of a word.
That was over 20 years ago.
LSI has little to zero relevance to how search engines rank websites today.
Search engines today incorporate databases of information to understand concepts such as people, places, and things, and how those concepts relate to keywords. Search engines today can promote a site from the second page of the results to the first if the algorithm understands that the page answers the question better than the core ranking algorithm alone would indicate.
LSI is training wheels for search engines.
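To show what LSI actually was, here is a toy sketch (an assumed four-document corpus, nothing resembling a production system): a singular value decomposition of a term-document matrix groups terms that co-occur, which is how LSI separated the senses of an ambiguous word like “bank.”

```python
import numpy as np

# Toy corpus (made up for illustration). "bank" is ambiguous:
docs = [
    "river bank water fish",    # "bank" in its geographic sense
    "water river fish boat",
    "bank money loan finance",  # "bank" in its financial sense
    "money finance loan credit",
]
vocab = sorted({w for d in docs for w in d.split()})

# Term-document count matrix: rows = terms, columns = documents.
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# Truncated SVD: keep only the 2 strongest latent dimensions ("topics").
U, S, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(S[:2]) @ Vt[:2]).T  # each row = a document in topic space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two river documents land together even though doc 1 never says
# "bank", while the two documents that both contain "bank" stay apart.
river_sim = cosine(doc_vecs[0], doc_vecs[1])
finance_sim = cosine(doc_vecs[0], doc_vecs[2])
```

That separation of word senses was genuinely useful in the 1990s, but it is a statistical trick over co-occurrence counts, not the entity-based understanding modern engines use.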
2. Link to .Edu and .Gov Sites
This is a strategy that many SEOs will go to their graves clinging to. Linking out to authority sites was an idea based on the HITS algorithm, which dates from the late 1990s.
Search for Hubs and Authorities + Kleinberg and you’ll find everything you need about it. That very ancient research is where the SEO strategy of linking out to authority sites started.
Later on, SEOs adapted that hubs and authority strategy to Google even though there are no current patents or research to indicate that Google or any other search engine uses a hubs and authority technology from 1999 in its algorithm. It’s not relevant to how search engines rank websites today. So why do SEOs continue to promote this idea without linking to any patents or research?
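For reference, the hubs-and-authorities idea itself is simple enough to sketch. The link graph below is a made-up toy example, not anything from a real search engine: each page gets a hub score (it points at good authorities) and an authority score (good hubs point at it), and the two are refined by mutual iteration.

```python
import math

# Toy link graph (assumed, for illustration): links[i] lists the
# pages that page i links to.
links = {0: [1, 2], 1: [2], 2: [3], 3: [2]}
pages = sorted(links)

hubs = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):
    # Authority score: sum of the hub scores of pages linking in.
    auth = {p: sum(hubs[q] for q in pages if p in links[q]) for p in pages}
    # Hub score: sum of the authority scores of pages linked to.
    hubs = {p: sum(auth[q] for q in links[p]) for p in pages}
    # Normalize so the scores converge instead of growing unboundedly.
    an = math.sqrt(sum(v * v for v in auth.values()))
    hn = math.sqrt(sum(v * v for v in hubs.values()))
    auth = {p: v / an for p, v in auth.items()}
    hubs = {p: v / hn for p, v in hubs.items()}

best_authority = max(auth, key=auth.get)  # page 2: every other page links to it
best_hub = max(hubs, key=hubs.get)        # page 0: it links to the top pages
```

Notice what this algorithm rewards: being linked *to* by good hubs. Nothing in it rewards a page for linking *out* to .edu or .gov domains, which is the strategy SEOs extracted from it.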
3. Keywords in H1, H2, etc
Keywords in headers are not relevant to how modern search engines work.
Google and Bing are not matching keywords on a web page to keywords in a search query. The SEO industry likes to have it both ways by talking about entities and natural language processing while still clinging to an idea about keyword pattern matching that’s straight from the 1990s.
Don’t believe me? Search with Google and see how many of your keywords are in the web pages of the top 10 ranked web pages.
Those are three examples of common SEO strategies that are based on old and outdated algorithms. I’ve demonstrated how to identify false SEO strategies. Now go out and build your own SEO BS Detector!
This post originally appeared on martinibuster.com and is republished with permission.