Happy new year everyone! Welcome to 2011 or MMXI to all of you Romans and movie producers! 2010 was a mixed year. The economy continued to sputter along, some things were quite memorable, others forgettable.
In my last article of 2010 here at SEJ, I documented some of the ways in which 2010 was an epic year for SEO. In that article, I invited readers to offer up predictions for what you thought would be epic changes to the search marketing arena in 2011, with the promise that I’d include those I felt were either spot-on or way off base in an upcoming article. That makes for a perfect first article of 2011.
My Inevitable Caveat

Unlike a plethora of hacks who claim they’re “good friends with Matt Cutts”, I only know Matt casually. I pummeled him early on in my Twitter life until he gently guided me to a better way of getting his attention, then went on to meet him at SMX Advanced last year and snap this photo of Matt getting ready to win some billiards in a game against Michael Martin…
So it’s not like I’ve got the inside track when it comes to my prognosticating – these are just my own beliefs, and thus I can’t be held liable if you take what I share here and your sites end up losing massive positioning in the SERPs 🙂
So now that we’ve got that out of the way, on with the predictions, and my take on whether they’re accurate or way off base…
Prediction: TBPR dies; MozRank and DomainAuthority become the Nielsen Ratings of link buying
Kris Roadruck went out on a limb with what has to be either extremely visionary or totally whacked 🙂
Toolbar PageRank dies permanently (it probably already did – people just don’t want to admit it until the new year goes by without an update), and SEOmoz’s MozRank & DomainAuthority become the new way to value link sales. The market adapts, Google’s plot to foil link sales falls flat on its face, life goes on.
Prognostication: Maybe

Sure, I’d love to see TBPR finally die. It’s a bogus number with no true capacity to indicate high relevance or quality targeted clicks – the stuff of “guru” SEO. So out of sheer hope, I’ll agree that it finally gives up the ghost this year.
Now – the concept that MozRank and DomainAuthority become the de facto standard by which links are bought and sold is an interesting one indeed. Frankly, the current model (Alexa, Compete, and other predictive analytics systems) for determining the price and value of link sales is pure fantasy.
In that regard, it would be refreshing to see MozRank and DA become the new model. Except I think it would take a massive marketing and advertising initiative on Rand’s part to get enough mainstream players to even consider the notion. And even if he could pull that off, it would probably take a couple of years to turn the tide.
Prediction: The Death of Keyword Anchor Text as a factor
This one came from Ian Williams. He then went on to clarify:
I just don’t see it as a realistic indicator of quality. Topic-relevance, sure, but not quality; they’re two different things and I expect/hope(?) Google to make a clearer distinction between the two in 2011.
I think of sorting the web like sorting a massive library. E.g.: I read a book called ‘The Hungry Caterpillar’ as a 3-year-old… by title and superficial references it’s very relevant to someone interested in insects, but it’s hardly ‘Origin of the Species’, is it? 😉
Relevance and quality should be treated distinct from one another – or, at least, searchers should have the option as to which the SERP is prioritised for! Ideas for sorting by quality could include factoring social signals in.
While Ian’s underlying concern and perception regarding the importance of discerning quality vs. topical relevance is very astute, there’s just no way anchor text is going to be completely eliminated in 2011 as a factor.
Here’s a reality – topic relevance will always be a factor. How that’s determined will always be a moving target, yet for now, comparing the content from the source page to the content on the target page comes down to the words on the pages.
Of course, Google has worked to modify what that means and how they rate it – to the point where if exact-match anchor text is all you have in your links, you’re now subject to phrase-level penalties. So you need a variety of words in your anchors, whereas for a while you could get away with exact phrases as your only method.
But completely eliminating this factor isn’t possible, simply because Google has had too much invested in anchor text as a factor for too long. They can’t easily extricate themselves from it. If they do go for complete elimination, it’s going to take a series of additional tweaks and changes to weighting values over an extended period of time.
Joshua Titsworth also provided his own take on this prediction in an article this week, where he believes User Experience is a reason to continue using such anchor text even if it’s eliminated as an SEO factor…
Prediction: Google Will Try To Pulverize 3rd Party Signals
In this scenario, Ramsay Crooks says:
I predict local factors and Google attempting to control how it handles real time content with social sites of its own (HotPot) could foreshadow Google’s new future plans by integrating more community into search… and trying to do it without relying on the databases of Facebook, Twitter, Yelp, etc…
Prognostication: Agreed (with a BIG FAIL when it’s all said and done)
The operative word in Ramsay’s prediction is “attempting”. Let’s face it. Google’s in business, based on free market rules. We all know that when free market rules are in play, until someone comes along with a better model, or the government steps in, companies will quite often do all they can to kill the competition.
And we all know now that Google sees Yelp as competition (HotPot, anyone?). Yet given Google’s *success* with Wave and Buzz (hahaha – seriously, Google, ask more people before you roll things out, okay?), it’s crystal clear that they’re highly skilled at delivering halfway decent search results, and fall flat faster than a tire rolling over a landmine when it comes to social. At least for now, they do.
The bottom line here is I think they’re going to spend a fortune *attempting* to 1) get every human on earth to stay inside the Google box and 2) in turn, make sites like Yelp, CitySearch and the rest irrelevant. And they’re going to fail at their ultimate goal.
Prediction: SEO will now be defined as Semantic Engagement Optimization
Dana Lookadoo predicts:
SEO as an acronym will need to change or simply become a generic term for optimizing for users across a variety of search platforms, especially social networks. We now have to focus more on engagement!
And as dollars and attention shift further online, I predict a low unemployment rate for qualified professionals in our sector. 😉
Prognostication: yes and no
2010 definitely saw the march toward the day we’ll need to change the definition of SEO, if not eliminate the acronym altogether, as more and more factors go into the process of getting more people to find your information, visit your site, and ultimately convert.
So I think this is only going to continue in 2011. More effort than ever will need to be given to off-site factors, social media signals especially.
Except the very use of the word “Semantic” is going to be a roadblock, because it’s a $50 word that mainstream site owners and business managers will choke on.