
Google’s Over SEO Optimization Penalty

If you haven’t already heard, Google’s Matt Cutts and Bing’s Duane Forrester recently did an SXSW session with Danny Sullivan where they addressed questions from the audience. About 1/3 of the way into the audio, Matt let slip a comment about an upcoming rankings change designed to target “Over Optimization” in an attempt to level the SEO playing field.

You can listen to the audio here.

About 1/3 of the way in Matt says this:

What about the people optimizing really hard and doing a lot of SEO. We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.

So what exactly is this new change?

Truthfully, nobody knows quite yet. Many are expecting Cutts to shed more light on this at SMX Advanced this summer; however, it’s quite likely that the change will happen well before then (if it hasn’t already happened).

There’s a lot of speculation on the various SEO forums about what’s involved, and if I had to venture my own guess I’d say it probably has to do with ranking factor correlation. I know that’s broad, but there are several directions Google could go with this.

Matt mentions too many keywords on a page, but I would assume that Google is already pretty good at detecting things like keyword stuffing and extreme link building, so there’s probably slightly more to it. But what?

I’ve written about my theory of sustainable SEO before, and if you’re coming to SMX Toronto I’ll be speaking about it in depth so I won’t go too much into it here. Basically, Google’s top priority is making sure search results are useful to the user. True SEO success comes when your site adopts that same goal. If you’re focusing on making your site add more value for users and doing so in a way that’s easily crawlable and shareable you should have no problems with this change. Sadly, many of us often forego usefulness in favor of quick results. That’s probably what this change is designed to catch.

So what could it be looking for? Well, remember when I said correlation? One thing they might be doing is comparing various ranking factors. If your site is gaining tons of links but nobody’s sharing it or tweeting about it then that’s probably not natural and might be a signal. That’s my guess.
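To make that guess a bit more concrete, here’s a rough, back-of-the-napkin sketch (in Python) of the kind of correlation check I’m imagining. To be clear, this is pure speculation on my part – the signals, thresholds, and numbers are all made up for illustration, and nobody outside Google knows what they actually measure.

```python
def link_to_share_ratio(new_links_per_week, new_shares_per_week):
    """Compare how fast a site gains backlinks versus how often people share it.

    Speculative signal: a site piling up links while nobody tweets or shares
    it doesn't look like naturally earned popularity.
    """
    total_links = sum(new_links_per_week)
    total_shares = sum(new_shares_per_week)
    # Guard against dividing by zero for sites with no social activity at all.
    return total_links / max(total_shares, 1)


# Made-up numbers: roughly 2,400 new links in a month but only a dozen shares.
ratio = link_to_share_ratio([600, 700, 500, 600], [3, 4, 2, 3])

SUSPICIOUS_RATIO = 50  # hypothetical cutoff, purely for illustration
if ratio > SUSPICIOUS_RATIO:
    print("Link growth far outpaces social sharing - flag for review")
else:
    print("Link and share growth look roughly in line")
```

The real system would presumably weigh dozens of signals against each other, but that’s the general flavor of “this link profile doesn’t match the rest of the data” I have in mind.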

Enough about my guess though…

What do you think it’s about?

I’d love to hear your thoughts in the comments. What do you think this change is really about? Do you anticipate your sites benefiting from it?

By day Ryan Jones works at SapientNitro, where he’s a manager of search strategy & analytics and works mostly on Fortune 500 clients. By night he’s either playing hockey or attempting to take over the world – which he would have already succeeded in doing had it not been for those meddling kids and their dog. Follow Ryan on Twitter at @RyanJones or visit his personal website: www.RyanMJones.com

48 thoughts on “Google’s Over SEO Optimization Penalty”

  1. Ryan, what you are saying about correlation is probably close, if not spot on. We had a conversation here recently around link bait and how social has changed it. I’m also wondering if link types and their value are a part of it.

  2. I’ve been waiting for Google to catch up on the over-optimization of top-heavy anchor text links. If every backlink you have uses the anchor text “SEO Company” that’s quite unnatural. Could be taking that into consideration too. It’ll be interesting to see how it plays out.

  3. If I had to venture a guess as to what he’s talking about in terms of “over optimization” I would say this update is designed partially to target too many inbound links with exact match anchor text…

    call it a hunch

  4. The concept of over optimisation has been around for quite some time. Links bleeding any form of authority to a destination site would be closely monitored for anchor text variance and a number of factors that would naturally or unnaturally occur.

    IMHO, it does seem that there are currently some small adjustments afoot, focused on over optimisation.

    Google first switched on over optimisation updates 5+ years ago and they have been running ongoing adjustments to it ever since. Many early updates were met with webmasters bleating about them, to the point where Google watered down all their updates into incremental adjustments like we see today.

    Over optimisation of on-page factors is often a case of “mom & pop” or “little league” sites making SEO mistakes. I highly doubt Matt and the team want to be beating down the man in the street for trying to play the game.

    So yes, I’m with Jeff: “if every link says SEO company” (thar be dragons).

  5. I guess the comment about having all anchor texts on a website pointing towards one URL, or several URLs with similar keywords, and vice versa, shall be considered over optimization. After all, content has to be present as much as the web world is linking to it. Let’s wait and watch.

  6. I probably should have mentioned it, but another theory I have is that Google may finally be going after sites that refuse to link out and instead only link to themselves several times throughout all their articles.

    I really hope that’s part of it.

    1. Ryan, are you referring to site links like footer links, or links within articles on a blog?

      ’Cause if it’s the latter, then I’m definitely with you.

      1. Articles within a blog. I’m referring to sites (won’t name any by name… think TECH industry) that will do an entire story about a company without linking to that company; instead, all the links are to their own coverage of that company. They’re basically black holes that only link to themselves and hoard PageRank.

      2. Aha,

        Well, if that’s the case, then it should really only apply to sites that are abusing that tactic.

        There’s nothing wrong with bringing attention to past articles you’ve written if they aren’t garbage. That being said, I think sites should link out as much as they link in. And NO, I’m not referring to arbitrary links to wikipedia…of course.

      3. I’ve seen sites and blogs doing that stuff. Well, Ryan is right, it’s not wrong to link to past articles if they aren’t garbage, but I hope they don’t do that often. Instead, they should link to other sites that are relevant to the anchor text they have.

        I think whatever this change may be, it will be a good one. A lot of people are already abusing SEO tactics these days just to get ranked in search immediately.

  7. The problem is there are too many sites out there that are still taking advantage of people to generate revenue. This causes low-quality and not very helpful sites to come to the top of the list. If you take a well-respected website that is trying its best to offer good, helpful, clean content, it should be near the top. This is going to benefit those sites that try to offer genuine information to their users on a consistent basis.

  8. Isn’t there already some kind of penalty against all inbound links having the same anchor text, though? I recently did some work on a site that had hundreds (if not thousands) of spammy inbound links with the same anchor text. No matter what I did, I couldn’t improve their ranking for that phrase. I assumed it was down to their shoddy link profile.

  9. I applaud Matt’s team for working on this, as it’s a never-ending game of cat & mouse. Rewarding those who build great content that is shared is an awesome idea, but it will likely just change how the optimization process goes. Google will change (this is a constant), then there will be people who know how to crawl and assess these changes and adapt. At the end of the day it will be good for some time, then a new pattern/game will emerge. E.g., create content, share it, then instead of link bombing it will be about ‘acquiring’ re-shares/likes/tweets, etc. I don’t think Matt’s team’s efforts are futile, but anytime you create a new pattern… there will be people who will assess and adapt to this new pattern. I’m guessing that while ‘on page’ tweaks will be done, there seems to be writing on the wall that a ‘social graph’ will be more important than the good old ‘backlinks with anchor text’ game.

    1. I have excellent content in my sites, all of them mechanical engineering sites; I separated them by subject and worked on them for years, and the funny thing is, one goes down 20 ranks, another goes up 30, two days later the one that went down is up and the one that is very good (as a site) is gone… 20 days later everything is upside down again.

      In the regular search engines, Ask, Bing, Yippy, etc., I am on the first page across the board, no hoops, no problems, just very good content.

      I think Google is shaking the field to force people to use pay per click. $$$ is the name of the game.
      Did you try to search on Google? In the organic results you will find just garbage…

        1. The latest round of changes in the algo over the last 8 or so months seems to be pointing to exactly that. Google is making it tougher to rank, and turning its SERPs (inadvertently, perhaps) into an autocracy.

        I agree 100% that ads and revenue are becoming the primary motivator, and that this trend will continue until another search engine comes along and shakes up the world of search much like Google did. Remember when Google launched: there were no ads, just organic results at the top of the page. This is one of the reasons people migrated en masse to Google from Microsoft and Yahoo search (plus quality results, of course). Now, the top three results are ads.

        They’re either hoarding cash/revenue for something big, or they’re forgetting their roots. Time will tell, I guess.

  10. I’ve written about this very topic many times before, and I think you’re right, they already are good at detecting keyword spamming and extreme link building, BUT… they’ve never spent a particularly good amount of time cataloging what TYPES of links are easy to get (blogs) vs. hard endorsements. Or directories & bookmarks vs. press links, etc. Correlate in-content links on higher-PR domains on heavily linked-to material with domains that already have a certain amount of clout and you’ve got a recipe for TRUST.

  11. Any tweak they do, or new system they implement….will have the ability to be gamed and exploited. I believe ‘people’ will become the new ‘links’….but even people can be bought or manufactured. Sad times for the search engines in my opinion.

  12. Create great content and work to establish and promote your company as THE authority for what you do in the markets you do it in. Do this and you’ll do well (maybe not the best, but well) in the organic rankings, and you’ll (in theory) never have to worry about future-proofing your SEO or any over-optimization penalties. That said, I wonder how far Google will take it? It’s going to be interesting to see how it plays out – that’s for sure!

  13. Totally agree with you, Ryan. Nowadays Google is bringing out something new every 2-3 days.
    But what if you have been doing SEO for a website for the last 1-2 years and it’s not showing results as good as we want – should we continue with the SEO?

  14. Google is always updating and changing their algorithm for better results and a better user experience. They also find ways to make sure no one games the system and gains an upper hand. If you are a good search engine marketer, keep up with the latest info, and follow the “Google” rules, you should have no problem producing unique content that is shared by many, and you will see your pages rise to the top of the search engine results pages from the natural links that are created. I suggest you start optimizing for Google’s latest semantic search and start creating pages with data from MetaWeb to keep up with the latest change in their algorithm.

  15. Yeah Ryan, you are completely right – I totally agree with you.

    @Tarun Saini: if you are not getting the results, then don’t stop your SEO practices, but try to put more stress on content, because nowadays your SEO will only be queen if your content is king.

  16. As an internet user, I find Google’s search result quality very poor these days.

    As an SEO, I think Google loves Google, and they love to make money.
    There is nothing wrong with this equation; it’s a free service, and nobody is forced to use it.

    Today, as a search engine, their results for the “customer” are very poor; it’s just a matter of time before they become like any other “gone for good” search engine.

    I don’t understand how they can “level the field” while bringing the user such poor results… They are just finding a way to push SEOs to use pay per click only; they don’t realize that not all people are willing to pay, or that their pay per click doesn’t bring ROI.
    I met 31 people this month who are not searching on Google anymore… Go figure…

  17. I think Google still tries to simply bring the user the most relevant results.

    With the number of banned Google AdWords users who have spent a lot of money (even $100,000 plus), I disagree with the statement that they want people to use PPC.

    Good relevant content is what the internet, to Google, is still about in my opinion.

    DL

  18. I assume that Matt was referring to the link wheel tactic (over optimization), which goes unnoticed and which many take advantage of to manipulate Google’s link scheme.

  19. “If you’re focusing on making your site add more value for users and doing so in a way that’s easily crawlable and shareable you should have no problems with this change”.

    You posted the paradox right there. OK, so Matt tries to go for the “Utopian” search, only naturally occurring, quality results, from objective sources, without over-optimization.

    Yet, they have to rank amongst themselves. Big business has a big advantage, and the “little” guy, who isn’t producing the brand but advertising it, will be stuck: unable to provide “original” content or outrank the brand in the SERPs, because Google might consider his optimized site not natural enough, made only for the purpose of promoting said brand. Google earns top dollar, but still not enough on paid search, so now people complying with the algorithm “too much” will be penalized. Biased much? That’s too arbitrary, deciding a site is “over-optimized.” I’m sorry for the sarcastic approach, but even if tomorrow Google, much like Chuck Norris, counts to infinity (twice), I won’t rush to subscribe to their approach to business.

    They took on the yoke of being the Search Engine, and the “search” part is going away faster than you can say “Ads!”

  20. There are so many elements involved but folks seem to be missing one of the most important ones: the presence of common ad-server coding. In one niche the SERPs are led by Wikipedia with maybe 10 lines of regurgitated data from the US Census Bureau surrounded by a bunch of internally massively duplicated links and other text. Then comes city-data with 500 lines of actual data, but the top of the page is virtually always occupied by two large AdSense ad units (and often enough major layout problems because of the two ad units). The next 20 results will be incrementally degraded copies of the Wikipedia regurgitation with a reducing mass of internal links. Anything that might be original about the place in question might start to appear on the third or fourth page of results. It has gotten better since they inserted the “official site determination” module but still…

    For all intents and purposes, it really looks like the single best foot you can put forward to reduce the number of penalties is to simply remove all common ad-server coding from a site (because that’s a major common denominator between Wikipedia and government-financed sites). That doesn’t mean the end of advertising; it does mean that AdWords, AdBrite, OpenX, Contextweb, Infolinks, DFP and the hundreds of other ad networks (and servers) out there that deliver their ads to you via javascripted coding that makes a call to any of the well-identified common ad-servers are a definite strike against you. Build your own internal ad-server and route everything through that, and you might have a better chance of keeping your rankings. That’s one thing the big boys have going for them… And I know, I know, folks will point and argue and prognosticate that Google never said they would penalize sites for running ad-server coding. They also never said they wouldn’t reward sites for not running it. To go a step further: any site that finds a way to monetize itself other than through the general-purpose ad-servers has obviously impressed someone, somewhere, enough that they have invested some of their hard-earned dollars in advertising on the site. That alone should denote some hint of better informational/traffic quality than sites plastered with common ad-server codes surrounding a hint of text in the middle…

    1. You presented it in a very simple way, and it makes sense.
      I just removed the only tracking code I have on this site.
      I am going to experiment with other sites and I will get back to you. Good thought! Thanks for the idea.

    2. As I was cleaning out the ad-server coding and deleting 3 sites from Google control, I noticed this:

      Webmaster Tools Sites: 79 total – 22 verified

      I registered only 22 sites with them, and they know I have more (79).

      Google connected me with 79 other sites…
      How can I disassociate myself from those sites?

  21. Well the new Google penalty for over optimization has already hit one of my sites. It went from Number 1 for hundreds of keywords to not being found anywhere. Webmasters beware, it might not hit your older websites but if you are launching a new website, be careful with your on page optimization.

  22. I think it has to do with the upcoming semantic search technology they’ve been talking about releasing. With semantic search, the Googlebot will be looking for other relevant content on the page that matches the user’s intent and is relevant in context, so it won’t matter if you have the same keyword used multiple times on the page; it will not make it any more relevant.

  23. I have read many articles on the new upcoming over optimization penalty. My question is: would this be a site-wide penalty or a page-level penalty?

    There may be many sites that have a few pages with overly optimized on-page SEO while the rest don’t. There may be many sites whose incoming links point to only a few pages, while the rest of the pages on their site have very few incoming links.

    Would this penalty affect the whole site, or only the pages that are indulging in over optimization, whether on-page, off-page, or both?

  24. There might be an “upcoming Over Optimization Penalty” but I dunno… the one we’re looking at here has been in the experimental stages for more than a year now. Google engineers tweak things all the time, usually with some hint of direction to their tweak. Often enough, the results of any particular tweak turn out to be opposite to the original intent. Do they back up and begin again? Not often, usually they just reduce the weighting of a couple variables and see what turns up. Eight or ten months after the original tweak went live, you’ll see a new post in the Webmaster Central blog crowing about a new feature… and the lemmings will rush ever-closer to the edge of the cliff. What Google doesn’t say is often more important than what they do say.

    To Vineet: if those are truly your concerns, you just might be screwed. Generally Google only penalizes pages, until the number of penalized pages adds up to 60% or more of the pages in the sitemap(s) you submitted to them. If you didn’t submit a sitemap, they go by what they have in the index with your name on it… At 60% and above, they bring out the nukes and just generally wipe you.

    Next we’ll be hearing about penalties for using “state-of-the-art” coding. The various runs of Panda have made it clear that Google can’t “see” quality, but they are learning to know (and program for) different aspects of what quality isn’t. One major problem that has popped up is seeing original-and-unique content creators demoted below the scrapers because the scrapers use more “state-of-the-art” coding. We have long been pounding on this particular “table,” since well before the first Panda run. We were assured directly it wasn’t a problem, then the first Panda run highlighted, big time, just how bad that problem is – because it was such an obvious reward to the scrapers and copy-pasters using state-of-the-art CMSes. So a lot of the “tweaks” we’re seeing these days are Google engineers pecking away at some of the smaller blocks of that whole. “State of the art” coding still leaves a lot to be desired and offers many more places for “programmers” to attempt to game the systems. Sometimes it’s just easier to do things right from the get-go. It sure is more fun… and it lasts longer… there’s just one problem: you must have something that is actually of objective value, is in-demand by some reasonably-sized subset of humanity and is at least some small contribution to the sum total of human knowledge, education and/or experience.

    Couple these observations with my comment above in regards to “common ad-server coding” and you’ll get a glimpse of where it’s all most likely headed. You might also see why “Presence of AdSense” is usually in the #2 slot in most lists of “10 Ways to Determine a Crap Site.” (#1 is usually occupied by anything “SEO” related)

    PS: I once asked a Webmaster Trends Analyst face-to-face if they were targeting sites monetized particularly by AdSense. The paraphrased answer: “No, we are not. If we can dump the crap sites from the SERPs, we kill their monetization. Kill their monetization and they die, because they have no other reason for existing. So the target isn’t just AdSense… the target is virtually every site that runs any kind of common ad-server coding. If the only reason for their existence is to try to make money from the display of those ads, they’re history. Sites must have some other discernible reason for being.” So the next ring around that bull’s-eye is the games people play in trying to draw more traffic…

  25. I’m betting over optimization penalties will have more to do with off-page link building than on-page optimization. I mean, we are already seeing that anchor link variation has an impact on SERPs. So hopefully those crummy, spammy backlinks will get you some sort of penalty.

    I would love to know what kinds of on-page optimization might trigger a penalty… hopefully nothing I currently employ on my site?

    1. Thomas,

      I’ve already seen first-hand websites drop in rank that are employing link wheel and spammy backlink techniques.

      With that said, it’s my opinion that pages that are overly keyword-stuffed will be hit as well.

      So… I can safely deduce that:

      (Off-site) Shady Backlinks + (On-site) Excessive use of the same Keyword Phrase = Penalty

      To me, it’s nothing new, BUT I think that what Google considers a “shady” backlink is what has been enhanced in this update.

  26. Does anybody else feel like a dog chasing their tail? There are so many hoops to jump through to get your site rankings up, and then the rules are always changing – we have to keep up or get left behind. Isn’t all of this Google ranking stuff just keeping us from actually spending time creating great content? It seems the system isn’t working out too well if it’s so easily abused.

  27. I think it comes back to keeping it real. If you get that funny feeling in your tummy that you might be getting an unfair advantage from what you are doing, then you probably are going to be affected by the penalty. If we are to be more scientific about it, however (more so than talking about funny feelings in your tummy), you really need to wait and see how it affects you. Unfortunately, we never REALLY know how an algorithm change will affect the search engine results until it actually occurs. But common sense prevailing, you would expect Google to go after spammy black-hat online marketers, and if you have kept it real with good, engaging content, you should be fine.

    On a separate note, as we all know, within days of the changes SEO guys will be looking for loopholes. As soon as a large percentage of the SEO population starts to use those same techniques (which ultimately leads to issues with the user experience on Google)… guess what… Google will change the algorithm again. And fair enough!

    Personally, I’m glad it’s happening, because it’s going to create a whole new need for engaging content, and not just the random drivel we seem to be getting nowadays from search engine results.

  28. I think we just need to abide by G’s rules and, yes, adapt to whatever changes they make, no matter how frustrating it is after you did all the work that used to work. It looks like recommending PPC is a good strategy to keep the flow of traffic to any website.

    Thanks for posting this article and to everyone who shared their thoughts and knowledge about the issue. Truly, nothing can beat quality content… at least for now.