A web publisher asked why an old, outdated site is outranking a newer, "fresher" site. The common SEO answer is that the old page has accumulated trust and that age plays a role. Google's John Mueller's explanation contradicted that common theory. The reason some old sites continue to rank turns out to be more nuanced than the simple age-and-trust answer.
Theory of Why an Old Site Continues to Rank
The web developer shared his theories about why the site still ranks despite content that, in his opinion, is thin, outdated, and served entirely over HTTP.
“So I’ve got a couple of theories about it. Part of it is, I think, maybe it’s been in the index so long it… kind of has a trust factor built up with them.
I also think that age might be part of the problem of trying to provide that newer fresher content. …in most cases what we have done over the last year is a lot more thorough than what was written say ten or twelve years ago.”
The web developer observed that, in his opinion, sites still using HTTP tend not to have been updated in two to three years. He regards sites using HTTP as being “like they’ve almost been abandoned” and sees HTTP itself as a signal of a lack of freshness.
HTTPS is a “Soft” Ranking Factor
Google’s John Mueller answered that HTTPS is not a particularly important ranking factor:
“HTTPS is a ranking factor for us. But it’s really kind of a soft ranking factor. A really small ranking factor.”
John Mueller then addressed the web developer’s underlying contention that his content should rank higher than the older site because it’s fresher.
“…freshness is always an interesting one because it’s something that we don’t always use. Because sometimes it makes sense to show people content that has been established.
If they’re looking at… long term research, then some of this stuff just hasn’t changed for ten, twenty years.”
John Mueller is stating that some content is evergreen, meaning it covers topics that don’t change very much over time.
For example, the process for boiling an egg has very likely remained the same for thousands of years. While new techniques or tools may evolve, the basic content continues to remain relevant.
Then John Mueller offered this:
“It can really be the case that sometimes we just have content that looks to us like it remains to be relevant. And sometimes this content is relevant for a longer time.
I think it’s tricky when things have actually moved on, and these pages just have built up so much kind of trust and links and all of the kind of other signals over the years where like well it seems like a good reference page.
But we don’t realize that… other pages have kind of moved on and become kind of more relevant.
So I think long term we would probably pick that up. But it might take a while.”
It was surprising that John Mueller referenced trust. Googlers, including John Mueller, have consistently pushed against the idea that Google uses any kind of metric called trust.
When a Googler references trust, they are usually referring to a wide range of signals. For example, high-quality links that continue to accrue to a page can be a signal that the page is still relevant. Likewise, an older site that does not appear to be adding manipulative external links (like paid links) to old pages can be a signal that the site continues to be on the level.
When a Googler uses the word “trust” they do not literally mean a metric called trust.
The web developer picked up on John Mueller’s use of the word trust and responded that he had the feeling that the outdated site’s ranking success had something to do with long term trust that the page had acquired.
John Mueller answered:
“I don’t know that we’d call it trust or anything crazy like that.”
That response is consistent with what Googlers and Google search engineers have consistently said about trust: there is no metric called trust that gives a site rock-solid ranking power. It does not exist.
John Mueller went on to say:
“It feels more like we just have so many signals associated with these pages. And it’s not that, like if they were to change, they would disappear from rankings.
It’s more well, they’ve been around, they’re not doing things clearly wrong for as long a time. And maybe people are still referring to them, still linking to them. And maybe they’re kind of misled in linking to them because they don’t realize that actually the web has moved on.”
So right there, John is referencing possible signals, like new references to the old web page and new links being created, that indicate the page continues to be popular with people.
That the page is relevant is an inference from its popularity, but popularity is not always a signal of relevance. John Mueller pretty much acknowledges that in his answer when he says, “And maybe they’re kind of misled in linking to them.”
John Mueller continued discussing evergreen content:
“I think it’s always tricky because we do try to find a balance between… showing evergreen content that’s been around and… being seen more as reference content and… the fresher content. Especially when we can tell when people are looking for the fresher content, we’ll try to shift that as well.”
Takeaways
- HTTPS is a weak ranking signal, what John Mueller called a soft ranking factor.
- Freshness isn’t always used for ranking.
- Links to a page can be a signal that the page continues to be popular and, by extension, relevant.
- Some content is viewed as reference-quality and evergreen and thus deserves to rank.
- If searchers signal that fresher content is more satisfying, Google will respond with fresher content.
- Old sites do not continue to rank simply because they are old.
- Old sites do not continue to rank because they have accumulated a vast amount of a metric called trust.
Watch the Webmaster Hangout here.