What I Learnt From Beating Google’s First Update of 2014

If you track your own websites, you may have noticed a Google update roll out on the 8th of January that has been causing large fluctuations in the SERPs ever since. This was the 6th-largest update in a year according to DejanSEO, and hundreds of webmasters can’t stop talking about it over at SERoundTable and WebmasterWorld.

One of my own sites was set back nearly 130 spots, falling from rank 5 to as low as rank 131. Over the course of 3 weeks I kept testing in order to understand what the update had done, and after some minor changes I successfully pushed my site all the way back up to rank 5.

Here is a quick summary of the timeline of events:

  1. January 9th – Site was hit by the penalty and dropped from rank 5 to rank 124.
  2. January 19th – Investigations led to having a PR0 exact-match anchor text link removed.
  3. January 26th – Monitored the results and actually saw a further drop of about 60 places since the link was removed. Fell to rank 131 at this point.
  4. January 28th – Decided to change the URL to avoid using my exact keyword. 301 redirected the old URL to the new one so that existing link juice would flow through.
  5. January 30th – Huge jump back to page 1, woohoo!
  6. February 1st – Site climbed to rank 6.
  7. February 4th – Site climbed further up to rank 5.

See here for more details on the timeline of events on my website.

Thankfully I documented all my steps, and looking back, I learnt some valuable lessons. Let’s see what we can all take away from this.

Content Isn’t King

A distinction needs to be made between when content is actually king and when it’s not. Too often people assume that if they have great content they’ll automatically rank for their target terms and never get hit by any penalty whatsoever. Wrong. I had a quality page up with beautifully crafted text and Shutterstock-worthy images (I even took these myself using my girlfriend’s DSLR), and I still got hit. Unfortunately, Google isn’t yet sophisticated enough to tell how well written your content is (although it’s getting there).

Content is, however, king for social sharing, building a brand, improving visitors’ time on site and so on. I’m a huge fan of good content, but unfortunately it doesn’t make you immune to penalties.

301 Redirects Pass All Link Juice

Although Matt Cutts has already come out and publicly stated that all link juice is passed through a 301 redirect, I still see this question asked a lot. My site leapfrogged nearly 130 spots back to page 1, and there was no way that could have happened without all of that link juice flowing through from my newly redirected backlinks.
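
The article doesn’t say what platform the site runs on or what the actual URLs were, so purely as an illustrative sketch, here is how a permanent 301 redirect might look in Python with Flask. The /best-blue-widgets and /widget-guide paths are made up for the example; the same rule could just as easily live in an .htaccess file or a WordPress redirect plugin.

```python
# Minimal sketch of a 301 (permanent) redirect in Flask.
# The two paths below are hypothetical placeholders, not the URLs from the article.
from flask import Flask, redirect

app = Flask(__name__)

OLD_PATH = "/best-blue-widgets"   # old, exact-match-keyword URL (hypothetical)
NEW_PATH = "/widget-guide"        # new, more natural URL (hypothetical)

@app.route(OLD_PATH)
def old_page():
    # code=301 marks the move as permanent, which is what signals
    # search engines to transfer the old URL's link equity to the new one.
    return redirect(NEW_PATH, code=301)

@app.route(NEW_PATH)
def new_page():
    return "The content formerly served at the old URL."

if __name__ == "__main__":
    app.run()
```

The key detail is the 301 status code: a temporary 302 would not tell Google that the move is permanent.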

301 Redirects Pass Link Juice Immediately

I recently read a blog post which claimed that link juice takes time to be passed through a 301 redirect. Based on what this test shows, that is also a myth. From the time I set up my 301 redirect to the time I was sitting on page 1 was a mere 2 days, so the juice from the redirect must have taken effect almost instantaneously.
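
If you want to confirm that a redirect is actually live, and permanent, before waiting on Google to recrawl, a quick check with Python’s requests library will show the status code and the target it points to. The URLs below are placeholders, not the ones from this test.

```python
# Quick check that an old URL returns a permanent (301) redirect
# pointing at the expected new URL. Both URLs are hypothetical placeholders.
import requests

OLD_URL = "https://example.com/best-blue-widgets"
NEW_URL = "https://example.com/widget-guide"

# allow_redirects=False lets us inspect the redirect response itself
# instead of silently following it to the final page.
resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)

print("Status:", resp.status_code)                # expect 301, not 302
print("Location:", resp.headers.get("Location"))  # expect the new URL

location = resp.headers.get("Location", "").rstrip("/")
if resp.status_code == 301 and location == NEW_URL.rstrip("/"):
    print("Permanent redirect is set up correctly.")
else:
    print("Redirect is missing, temporary (302), or points somewhere unexpected.")
```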

The Update Was Not Penguin Related

Forums are oftentimes the first place to find breaking news. At the same time, forums are probably the best place to start a rumor.

A thread started brewing in the WebmasterWorld forum with many people claiming a Penguin update. I tested this by having all exact-match anchor text links pointing to my site removed (luckily there was only 1, otherwise contacting all those webmasters would have been quite a task). As you can see from the timeline of events, I saw zero improvement, and in fact dropped a further 60 places a few days later.

Google is Lenient

I was quite surprised at how significant the improvements were after simply changing the page URL. All of this happened within a few days of making the change, which is incredibly fast. What this tells me is that Google is actually quite lenient when it comes to this specific penalty.

Don’t Over-Optimize Your Header Tags

Only a few months ago I was seeing great results from having my exact keyword in the title tag, H1 and URL. Things change quickly in SEO though, and this test indicates that you now want to avoid overusing your exact keyword in those places. What seems to work better now are variations of your keyword, almost as if you had no idea which keyword you were targeting. Almost as if it came out naturally that way, which brings me to our final point…
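
As a rough way to spot this kind of over-optimization on your own pages, a short script can check whether the exact keyword shows up in all three of those places at once. This is only an illustrative heuristic based on the observation above, not anything Google has published; the sample HTML, URL and keyword are made up.

```python
# Rough heuristic: flag a page where the exact keyword appears in the
# <title>, the <h1> AND the URL slug. Illustrative only; sample data is made up.
import re

def exact_keyword_locations(html: str, url: str, keyword: str) -> list:
    kw = keyword.lower()
    hits = []

    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if title and kw in title.group(1).lower():
        hits.append("title")

    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    if h1 and kw in h1.group(1).lower():
        hits.append("h1")

    # Turn the last URL segment ("best-blue-widgets") back into words.
    slug = url.rstrip("/").rsplit("/", 1)[-1].replace("-", " ").lower()
    if kw in slug:
        hits.append("url")

    return hits

sample_html = ("<html><head><title>Best Blue Widgets 2014</title></head>"
               "<body><h1>Best Blue Widgets</h1></body></html>")
sample_url = "https://example.com/best-blue-widgets"

hits = exact_keyword_locations(sample_html, sample_url, "best blue widgets")
print("Exact keyword found in:", hits)
if len(hits) == 3:
    print("Consider using a variation of the keyword in at least one of these places.")
```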

Biggest Takeaway: Keep Everything Natural

If there’s one thing to learn from this, it’s to keep your site looking as natural as possible, almost as if you knew nothing about SEO. With the Hummingbird update, Google is able to understand the meaning behind sentences, so you don’t need to stuff your keyword into every tag. Instead, use variations and synonyms of your keyword to create that natural effect.

I realize that it may be hard for some SEOs to determine what is natural and what isn’t anymore, because sometimes it goes against everything we’ve learnt. So, here are a few more tips you can use to keep your site looking natural in Google’s eyes:

  • Don’t link hoard. Link out to others when it makes sense
  • Have a Privacy Policy and Terms page
  • Have a thorough About Us page
  • Have active Social Media accounts linked to your site
  • Have a branded domain name

There is a lot we can learn from this first update of 2014, and if we embrace all future updates with the same mindset, we’ll have no reason to be afraid of any algorithm changes to come.


Hin Lai

Founder and CEO at Search Highway
Hin is the mastermind behind the digital agency Search Highway, and is someone who is never satisfied until his clients receive a positive ROI from working with him. When he's not busy with client work (rarely), you can find him hanging out on SEO news channels.

33 thoughts on “What I Learnt From Beating Google’s First Update of 2014”

  1. I have used this tactic on sites that have lost “google love” after the initial period. However, it does not always last.
    I would be interested to see if you keep your rankings or whether you drop again.

    1. As of today I haven’t noticed anything (some minor fluctuations between rank 5 and 6 and back). I’ll have to keep you all posted on how things go!

  2. Hin,
    everything you covered was great, but I was wondering: what is your opinion on a rest. site or other smaller sites having a Privacy Policy and Terms page?

    Much thanks
    -Chenzo

  3. So it sounds to me like you had a penalty. Did you check WMT for a notification? If so, the 301 might have temporarily avoided the penalty, but my guess is that it will come back once the algorithm catches up.

    All this stuff about changing 1 link, I’m not sure what that is. I don’t believe only one link can cause a rankings drop of 100 places.

    1. Hi Ryan,

      WMT was one of the first places I checked to see if it was a manual penalty, but there was nothing. Removing that 1 link was a test to see if it actually had any effect. If you read my original post (linked in the article above), you’ll see that removing that link had zero effect – not surprising!

  4. This was a really narrow view on a complex topic. Figuring out a specific penalty you were able to overcome is hardly a great basis for summarizing the ENTIRE update. If you browse forums, you will see that site owners describe various factors they have observed. Some of us have pages on the same site/domain that preserved their rank and pages that dropped due to a preference for other sites on a topic (an external factor), pointing to a change in context analysis. Many saw their sites pushed out by “official” sites (government, edu), e.g. sites with no ads. Thanks for sharing your experience, but please don’t rush to general conclusions using such a small data sample.

    1. Hi Lilia,

      I don’t disagree that this requires a larger sample size, though I hope it gives some insight to others based on what I had seen. I also saw pages on my site that preserved rankings whilst others got hit. I haven’t quite figured out why some pages were affected, whilst others were not, but I’m doing some testing to figure that out.

      With that said, if you’ve simply been pushed out by other sites, then you most likely haven’t been affected by the update. You’ll know when you’ve been hit when you’re down 100 spots, as opposed to a couple.

  5. You are lucky; one of my websites was thrashed very badly by Google just before the new year, and I still haven’t been able to recover…

    1. If it’s an algo penalty then keep testing and eventually you’ll get there! If it’s a manual penalty, it might take a bit longer. You might even want to consider a new domain.

      Good luck!

  6. It is difficult to draw any conclusions from this because you neglected to share important things like the URL, the keyword or anything that would allow people to see what really happened. For example, you say it was truly great content, but nobody thinks their own baby could be ugly. It may have been well written and nice looking, but was it the same type of thing that is available on hundreds of other sites?

    I think you might be jumping to conclusions throughout this. One bad link is not likely to trigger a penalty, unless you have lots of other bad links or particularly poor quality in the content or the site itself.
    Also, this “update” has not even been confirmed as a major one. Google makes more than 500 updates to the algorithm in a year. Yes, there was lots of fluctuation in rankings after the 8th, but that happens sometimes without there being a specific boogieman.
    Regardless, removing links usually takes weeks to have any effect.
    Changing the URL could help sooner, depending on how frequently Google crawls the site.
    Given the relatively short time frame, and the volatility of search results after Jan 8, I think you have managed to attach some nice graphs and guesswork to what are simply random fluctuations. If you want to truly test it, put everything back the way it was, including that removed link. Then see what happens after about a month, not a few days.

    And think about this – how much time was spent obsessing over that one link, or whether using your keyword just a few times was too much? You could have spent that time improving your site in some way, or working to promote that page in productive ways like social media or outreach, assuming it is something that is good enough that others would want to share it. Or writing your next piece of truly great content.
    I know nobody ever considers that their content may not be the best, most useful, or even original in concept – but perhaps your initial good ranking was due to freshness and it was only a matter of time until that faded away. There is a reason people frequently say time is better spent improving what you can improve rather than chasing algorithms and links in hopes of making “OK” content appear to be more popular than it really is.

    1. Hi Barry,

      I did consider when putting this out that it could be disregarded because of sample size, testing time, etc., and I agree that it isn’t a conclusive test. With that said, I think it could be something very useful for some webmasters, or at least give them some indication as to what they can be testing themselves.

      On that same note, I think any great SEO needs to do their own testing. This test took me about 60 minutes in total, which seems quite reasonable as I’ll be able to apply this knowledge to all my own and client sites. Unfortunately I don’t think ‘working on content’ is sufficient as there’s no measurement for good content. I think this is something that’s very difficult for any algorithm to decipher, including Google.

      1. If you seriously believe there is no measurement of quality, you should probably stop giving SEO advice. If you have been paying attention over the past 5 or so years, you would know that SEO is much more than counting keywords and getting links that only “appear” natural. Google’s algorithm updates in the past few years are all about quality, on page as well as off page indicators of the quality and value of a page.
        Panda – is all about quality. Penguin – lots of spam links = probably not good quality. Social signals – don’t typically happen without quality. Engagement metrics – do people engage with poor quality content? Sometimes – I am engaging with a well written but poorly researched article right now. Content algorithms – can determine readability levels, uniqueness and a variety of other things and interpret that data as a measure of quality. And then there are things we don’t know if they are in use yet: authorship & its possible influence, and consumer sentiment, for examples.

        That it “took about 60 minutes” is exactly why this is a waste of your time and misleading for anyone who thinks this will help. You determined almost nothing. Your tests were far too close together to determine which change had which effect, in addition to your small sample size. And in that 60 minutes, you overlooked hundreds of other possible contributing factors. You end up with “it could be this, it could be that, it could be something else… but I want to believe it was THIS one so that’s what it is”.

        When people tell you to think more about quality, this is what they mean. If you had done some solid research that was truly “quality” information and shared some actual data instead of just an unverifiable anecdote, people would link to it and share it far more than just those people who tweet everything they read and automated sharers.

    2. There has ABSOLUTELY been an update (more than one major one, in fact – I would say three). It is strange though that this seems like a “stealth” issue. I haven’t heard much about it but I have two sites myself and I know of half a dozen others that have been hammered since Jan 20th. These are ALL long-standing, good quality news sites that have been considered authorities by Google for years. What they have in common is that they are independently owned.

      Looking over SERPs where we used to rank, what I now see is independent sites replaced by large, corporate, general news sites – even to the detriment of the results. Example: If the average movie fan is searching for “Avengers 2”, do you think they really want articles from Forbes and Motley Fool?

  7. The rules for attaining good SEO are changing very fast, almost by the hour. Some practices which were once considered White Hat SEO are now considered Black Hat SEO. In such a situation it is very difficult for a newbie blogger to do the right and legal thing for his/her blog, but articles like the one above are always a treat and help newbies like me very, very much. So thanks for a great article. Thank you very much.

    1. You’re absolutely right that SEO changes incredibly fast. Something that may have been right yesterday could be wrong today. I was hoping to give some specific advice in this article that could be applied immediately, so I’m glad you found it useful!

  8. If you were hit by a penalty, did you get a warning or a message? Second, all of the information I’ve read is about quality links, so this is quite a confusing article… sorry

    1. Hi Phil, there was no warning or message; it was an algorithmic penalty rather than a manual one. This is just what worked for me, and I’m hoping others can benefit from it as well.

    1. There are definitely more obstacles to making changes on a company page, as opposed to a personal blog, but I think the key is to be natural, so I’m guessing you can change title tags and other on-page text to get a similar effect.

  9. Thanks Hin. I think this is great information that could be useful. I see it has brought a lot of discussion and while I think you have addressed that not everyone may find this to be the answer for their drastic drop – this may help a lot of us. I look forward to more updates from you!

  10. Hin – While I do agree with some above that your solution is still in testing and only time will tell if it works on a permanent basis, I do applaud you for two things:

    1. Testing and trying something NEW. I think with all the Panda, Penguin and Hummingbird updates over the last 18 months, some SEOs have started to shy away from this aspect of SEO, and that’s a shame :(

    2. Sharing good practices – Your recommendations and conclusions are solid advice regardless of the test that led you to them. Over-optimizing anything these days (Meta Data, Content, Links, etc.) is not a good thing.

    Looking forward to a follow up post on the longevity of this test!

    1. Thanks Mike – so far my rankings are still holding up very well but I’ll be sure to come out with a follow up after giving it more time!

    1. Hi Manesh,

      It’s hard to say without seeing the links themselves, but I’d suggest looking them up in Open Site Explorer or Ahrefs and checking the URLs they appear on. I see this a lot when a backlink is removed from the homepage or sidebar of a site that has lots of pages: since there was a backlink coming from every page on the domain, the crawlers slowly see that the link no longer exists, so the reported counts decrease over time.

      Hope that helps!

  11. @Hin:
    If we are redirecting the old URLs (over-optimized with exact anchor texts) to the new URLs, and all link juice passes through 301 redirects, won’t this cause the same issue in the future, since the link juice is still being passed from the same anchor texts?