It seems as if links don’t rank sites in the same way as in the past. A good batch of links no longer guarantees you’re going to rank for a phrase. Also, it’s possible there are other ranking factors that are muting the influence of links. The following represents my opinion. Feel free to disagree.
I have for several years been under the impression that we are in a search environment in which the influence of links has been diminished. Links still matter, but content has, in my opinion, gained increasing influence over what ranks.
First Barrier to the SERPs: The Core Algorithm
A regular question on black hat groups is about the Pop and Drop. The Pop and Drop is when a site acquires a set of links, pops into the SERPs, and then later drops off. The links were devalued. I believe that's the Penguin algorithm working in real time as part of the Core Algorithm.
When I say real-time I don’t mean in literal real-time. There’s a lag between the indexing of the web and the recalculation of values across various SERPs. It’s referred to as real-time by Google.
Second Barrier to the SERPs: Content Analysis
Bing is cool because it has relied on content more than links. Links play a role, but in my opinion that role is limited to validating whether a site is authoritative, useful and trusted. This is an important point.
Trustworthiness and Links
Clean the link signal of anchor text spamming and that's what's left: a signal that indicates trustworthiness.
To what extent should Google give weight to anchor text? There is evidence that Google might use the text that surrounds the anchor text in the way Google used to use anchor text.
SEOs are quick to put down Bing but go quiet when asked how to rank on Bing. The reason is that Bing has focused on understanding content and user preferences while SEO has focused on rote keyword and link building strategies for Google. But those rote strategies don't work anymore. Google has been increasing the ranking power of on-page and user preference signals. Like Bing.
How Should Link Building be Approached?
The takeaway for link building, in my opinion, should be to focus on proving trustworthiness and making sure the machine understands what niche our web pages fit into. The way to communicate trustworthiness is to be scrupulous about which sites you obtain links from, but also to be super careful about which sites you link out to.
For link building, I believe it may be even more important now that the page your link is on has relevant content on it. Also, make sure that the outgoing links are relevant as well, not just from that web page but from the entire site. What good is a link from a good page if every other page is linking to low quality sites? A site like that will have its ability to pass PageRank removed. This is how the reduced link graph works.
Can Google Identify Paid Links?
If I were to speculate, I’d say that Google might use outgoing links as a major factor for identifying paid links. Not just from a single page but from the entire site.
News websites that publish articles from contributors who sell links under the table end up outlinking to a class of sites that puts those news sites into negative link circles.
This, in my opinion, results in links that do not pass PageRank.
There are various algorithms demonstrating that re-classifying sites as spammy based on their outlinking patterns works. Among them are trust-type algorithms, anti-trust algorithms and double-seeded anti-trust algorithms that re-classify sites and create a reduced link graph.
These kinds of algorithms, which are similar to the Penguin Algorithm, prevent the propagation of PageRank. Read: Link Distance Ranking Algorithms
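The reduced link graph idea above can be sketched with a toy example. Everything here is hypothetical and greatly simplified: the graph, the spam seed set, and the one-hop re-classification rule are illustrations of the general technique, not Google's actual algorithm.

```python
# Toy sketch of a reduced link graph (hypothetical, for illustration only).
# Rule: any site that links out to a known spam seed is re-classified as
# spammy, and spammy sites are removed before PageRank is computed, so
# their links pass no PageRank.

def reduced_link_graph(graph, spam_seeds):
    """Drop the spam seeds plus any site that outlinks to one (one hop)."""
    spammy = set(spam_seeds)
    spammy |= {site for site, outlinks in graph.items()
               if spammy & set(outlinks)}
    return {site: [t for t in outlinks if t not in spammy]
            for site, outlinks in graph.items() if site not in spammy}

def pagerank(graph, damping=0.85, iterations=50):
    """Plain iterative PageRank over the (possibly reduced) graph."""
    n = len(graph)
    ranks = {site: 1.0 / n for site in graph}
    for _ in range(iterations):
        new = {site: (1 - damping) / n for site in graph}
        for site, outlinks in graph.items():
            targets = [t for t in outlinks if t in graph]
            if targets:
                share = damping * ranks[site] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank evenly
                for t in new:
                    new[t] += damping * ranks[site] / n
        ranks = new
    return ranks

web = {
    "news-site": ["legit-blog", "link-seller"],  # sells links under the table
    "legit-blog": ["news-site"],
    "clean-site": ["legit-blog"],
    "link-seller": [],
}
reduced = reduced_link_graph(web, spam_seeds={"link-seller"})
print(sorted(reduced))  # → ['clean-site', 'legit-blog']
print(pagerank(reduced))
```

Note that "news-site" gets dropped from the reduced graph entirely: its link to "legit-blog" no longer passes any PageRank, which is the point of the article's "links that do not pass PageRank" observation.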
There are situations where the rankings based on links are overruled. The reason is what’s known as the modification engine. The modification engine is a set of algorithms related to personalization. Geography and previous searches can influence the kinds of sites you see.
If the majority of people who search with a given query are from a specific geographic area, Google might even pull a page from page 2 of the SERPs and rank it near the top IF the algorithm determines that it may likely satisfy users who are from that geographic area.
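The geographic re-ranking described above can be sketched as a simple post-processing step over a base ranking. This is a hypothetical illustration of the modification-engine concept: the scoring scheme, the geo boost value, and the example sites are all invented, not Google's actual mechanics.

```python
# Hypothetical sketch of a "modification engine" re-rank step: the base
# (link/content) ranking is computed first, then results matching the
# searcher's inferred geography are promoted, which can pull a page-2
# result near the top.

def modify_ranking(results, user_geo, geo_boost=0.5):
    """results: list of (url, base_score, site_geo). Returns re-ranked URLs."""
    def score(result):
        url, base, site_geo = result
        return base + (geo_boost if site_geo == user_geo else 0.0)
    return [url for url, _, _ in sorted(results, key=score, reverse=True)]

serp = [
    ("national-brand.com", 0.90, None),
    ("big-directory.com", 0.85, None),
    ("local-plumber.com", 0.55, "Austin"),  # page 2 by base score alone
]
print(modify_ranking(serp, user_geo="Austin"))
# → ['local-plumber.com', 'national-brand.com', 'big-directory.com']
```

For a searcher outside that geography the base ranking stands, which is why the same query can produce very different SERPs for different users.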
TAKEAWAY #1 – Links for Inclusion
Even when the modification engine kicks in and the core ranking factors are set aside, links still matter; they just don’t matter for ranking in this scenario. They matter for inclusion.
In order to stay in the SERPs it's important to think about the outbound links on your site and the sites you obtain links from. Think in terms of reduced link graphs, with spam on the outside stuck within its own cliques and non-spam on the inside within the trusted Reduced Link Graph.
In my opinion, you must be in the trusted Reduced Link Graph in order to stay in play.
TAKEAWAY #2 – Lose the 200 Ranking Factors
Your job as a link builder is to determine the exact reason why Google is ranking a site in positions 1-3, and it's not always links or on-page content. Once you've identified that reason, update your strategy to conform with it. Do NOT seize on the obvious reasons.
Remember, just because a site has backlinks from dozens or hundreds of “powered by” footer links does not mean those links are what is powering those rankings.
In modified SERPs, traditional ranking factors are set aside. So those “powered by” links are NOT what is powering those rankings.
That’s why I believe it’s important to stop thinking in terms of 200+ ranking factors when trying to diagnose the reason why certain KINDS of sites are ranking in the top three positions, the next three and so on.
If obvious ranking factors (on page and off page) do not make sense, stop looking at them and start considering the ranking from the perspective of the Modification Engine.
Failure to set aside 200+ ranking factors from your diagnosis may cause you to miss the real reason why a site ranks.
For example, if you conclude that footer "powered by" links are the reason a site is ranking, when the real reason might actually just be the user intent inherent in the content, then you will miss the real reason the site is ranking.
And with that, you will miss the opportunity to figure out a way to defeat the competitor.
If Google is preferring sites from a geographic area, focus on getting links from sites that are tied to that geographic area, especially if it’s in the name of the site, on the web pages or in the Whois registration data.
Put a page up with the name of that geographic area and build links and non-link citations from sites based in that geographic area.
Sometimes Google prefers educational, scientific or informational sites. Make the determination. Then incorporate that finding into your link building strategy.
I think it’s useful to stop thinking in terms of 200+ ranking factors. I believe it’s increasingly important to do competitor analysis and content creation from the perspective of understanding what users want as evidenced by what Google is ranking.
The new reality is that you cannot earn, build or buy enough links to crack the top two, three or five if the page you are building links to is not the same kind of page that currently ranks at the top.
Did Google Win the Link War?
There is another issue affecting the usefulness of links and that’s how links are counted and not counted.
Consider this: It’s been a few years since any significant research has been published about combating spammy links or propagating non-spammy links from one legit site to another legit site.
Most of the research being done today is on understanding content and understanding user intent. It’s almost as if the link war has been won.
Does that mean the search engines have you beat?
Not in my opinion. Google is focused on users. It makes sense that a practical publisher is going to focus on what users want, as well.
Do SEOs Focus Too Much on Links?
Googlers have said that SEOs focus too much on links. I tend to agree. That belief in links is what drives the commerce in private blog networks and paid link schemes.
Links are important. But creating the content that earns those links is very important. There is so much focus on understanding search queries and understanding web pages that it just makes sense to go back to basics: focus on understanding what users want (understanding search queries) and how that relates to on-page content.