
The Holy Grail of Panda Recovery – A 1-Year Case Study

Prospective client: Help! I’ve fallen, and I can’t get up!

Me: Where are you?

Prospective client: Massachusetts

Me: I’m 3,000 miles away! Call 911!

Prospective client: No, it’s my website! We lost 50% of our traffic overnight! Help!

Me: How much can you afford to pay?

Okay, that’s not actually how the conversation went, but it might as well have, given that I came in and performed a strategic SEO audit, the client made all the changes I recommended, and, as a result, a year later they’ve increased organic traffic by 334%!

[Screenshot: the Panda recovery traffic numbers]

Now, when you look at that data, you might assume several things. Ryan Jones saw those numbers and tweeted just such an assumption:

[Screenshot: Ryan Jones’ tweet questioning the content changes]

Rather than leaving people assuming what was done to get such insanely great results, I figured it would make sense to write an actual case study. So once again, I’m going to ask you to indulge my propensity for excruciatingly long articles so that I can provide you with the specific insights gained, the tasking recommended, the changes made, and the lessons learned.

First Task – Look at the Data for Patterns

One of the first things I do when performing a strategic SEO audit is look at the raw data from a 30,000-foot view to see if I can find clues as to the cause of a site’s problems. In this case, I went right into Google Analytics and looked at historic traffic in the timeline view going back to the previous year. That way I could both check to see if there were seasonal factors on this site, and I could compare year-over-year.
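
If you prefer to script that first pass rather than eyeball the timeline, here’s a minimal sketch of the year-over-year comparison, assuming you’ve exported daily organic visits from Google Analytics to a CSV; the file name, column names, and thresholds are all hypothetical:

```python
import pandas as pd

# Load the export; assumes a continuous daily series with one row per day.
df = pd.read_csv("organic_visits.csv", parse_dates=["date"])
df = df.set_index("date").sort_index()

# Compare each day against the same day one year earlier to separate
# seasonal dips from genuine year-over-year losses.
yoy = df["visits"] / df["visits"].shift(365) - 1.0

# Flag sustained drops: a 30-day rolling average down 30%+ year-over-year
# is worth lining up against known algorithm update dates.
sustained = yoy.rolling(30).mean()
print(sustained[sustained < -0.30].head(20))
```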

That’s when I discovered the site was not just hit by Panda. The problem was much more deeply rooted.

The MayDay Precursor

[Screenshot: 2010 Google Analytics timeline showing the MayDay traffic drop]

The data you see above is from 2010. THAT major drop came from Google’s MayDay update, and as you can see, daily organic visits plummeted as a result. That’s something I see in about 40% of the Panda audits I’ve done.

From there, I then jumped to the current year and noticed that, in this case, MayDay was a precursor to Panda. Big time.

[Screenshot: Analytics timeline showing MayDay as a precursor to the Panda 2.0 hit]

What’s interesting to note from this view is that the site wasn’t caught up in the original Panda. The hit came the day of Panda 2.0.

Okay, let me rephrase that. Both of my claims about the site being hit so far, MayDay and Panda 2.0, are ASSUMPTIONS. We work in a correlation industry, and since Matt Cutts refuses to poke this SEO bear, I doubt he’ll ever give me a personal all-access pass to his team’s inner project calendar.

[Image: the Matt Cutts “don’t poke the SEO rant bear” joke]

Correlation-Based Assumptions

Since we don’t have the ability to know exactly what causes things in an SEO world, we can only proceed based on what Google chooses to share with us. This means it’s important to pay attention when they do share, even if you think they’re being misleading. More often than not, I have personally found that they really are forthcoming with many things.

That’s why I rely so heavily on the work of the good Doctor to keep me informed, since I can’t stay focused on Google forum chatter myself. I’m talking about the work Dr. Pete has been doing to keep the SEOmoz Google Update Timeline current.

An Invaluable Correlation Resource

The SEOmoz update timeline comes into play every time I see a significant drop in the organic traffic of a site that’s called me in to help. I can go in and review every major Google change that’s been publicly announced, all the way back to 2002. You really need to make use of it if you’re doing SEO audits. As imperfect as correlation work can be, I’ve consistently found direct matches between entries in that timeline and the major drops that send people to me for help.
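
To make that cross-check concrete, here’s a sketch of how a traffic series can be lined up against a few update dates copied by hand from the timeline. The dates below are the commonly reported ones for MayDay and Panda, and the drop threshold and window size are arbitrary assumptions:

```python
import pandas as pd

# Update dates as commonly reported; always double-check the timeline itself.
updates = {
    "MayDay": "2010-05-01",
    "Panda 1.0": "2011-02-24",
    "Panda 2.0": "2011-04-11",
}

visits = pd.read_csv("organic_visits.csv", parse_dates=["date"],
                     index_col="date")["visits"].sort_index()

# Smooth to a weekly average, then measure week-over-week change.
week_change = visits.rolling(7).mean().pct_change(7)

for name, day in updates.items():
    ts = pd.Timestamp(day)
    # Look from a few days before to ten days after each announced update.
    window = week_change.loc[ts - pd.Timedelta(days=3): ts + pd.Timedelta(days=10)]
    if (window < -0.20).any():
        print(f"Possible hit near {name} ({day}): worst change {window.min():.0%}")
```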

The Next Step – Is It Actually a Google-Isolated Issue?

Once I see a big drop in a site’s organic traffic, I dig deeper to see if it’s actually a Google hit, a drop across all search engines, or even a site-wide issue. Believe me, this is a CRITICAL step. I’ve seen sites that “appeared” to be hit by search engine issues when it turned out the site’s server had hit a huge bottleneck. Only by comparing ALL traffic vs. Organic Search vs. Google-specific traffic will you be able to narrow down and confirm the source.

In this case, back during the Panda period, organic search accounted for close to 80% of site-wide traffic totals. Direct and Referrer traffic showed no dramatic drop over the same time period, nor did traffic from other search sources.
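
Here’s a minimal sketch of that narrowing-down step, assuming you’ve exported daily visits broken out by segment into one CSV; the column names and date windows are my own placeholders, with the windows roughly bracketing the Panda 2.0 period discussed above:

```python
import pandas as pd

df = pd.read_csv("traffic_by_segment.csv", parse_dates=["date"],
                 index_col="date").sort_index()

# Average daily visits in the month before vs. the month after the hit.
before = df.loc["2011-03-10":"2011-04-10"].mean()
after = df.loc["2011-04-12":"2011-05-12"].mean()

# If only google_organic cratered while direct and referral held steady,
# it's a Google hit, not a server or site-wide problem.
print(((after / before) - 1.0).round(3))
```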

Curiosity Insight – Google Probes Before Big Hits

Not always, but once in a while when Google’s about to make a major algorithm change, you’ll be able to see (in hindsight) that they turned up the crawl volume in a big way.

[Screenshot: crawl volume jumping before the hit]
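
You can spot that kind of crawl ramp-up yourself by counting Googlebot requests per day in the raw access logs. A minimal sketch, assuming common/combined log format; note that a serious check should also verify Googlebot via reverse DNS, since the user agent string can be faked:

```python
import re
from collections import Counter
from datetime import datetime

daily_hits = Counter()
date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [11/Apr/2011:...

with open("access.log") as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        m = date_pattern.search(line)
        if m:
            day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
            daily_hits[day] += 1

# A sudden multi-day jump ahead of a known update date is the tell.
for day in sorted(daily_hits):
    print(day, daily_hits[day])
```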


The SEO Audit – Process

Here’s where the rubber meets the road in our industry. You can wildly guess all you want about what a site needs to do to improve. You can just throw the book at a site in terms of “the usual suspects.” You can spend a few minutes looking at the site and think, “I’ve got this.” Or you can spend hundreds of hours scouring thousands of pages and reviewing reams of spreadsheet reports and dashboards, all to come up with a plan of action.

Personally, I adopt the philosophy that a happy middle ground exists: look at just enough data, examine just enough pages, and spend just enough time at the code level to find the unnatural patterns, because it’s the unnatural patterns that matter. If you’ve examined two category pages, the fundamental problems common to both will almost certainly exist across most or all of the other category pages. If you look at three or four randomly selected product pages, any problems common to most or all of them will likely exist on all of the thousands or tens of thousands of product pages.

So that’s what I do. I scan a sampling of pages. I scan a sampling of code. I scan a sampling of inbound link profile data and social signal data.
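
Here’s a rough sketch of what that sampling scan can look like when scripted, assuming the requests and BeautifulSoup libraries and a hypothetical handful of category and product URLs. If the same numbers show up on every sampled page, you’re looking at a template problem, not a page problem:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical sample: a couple of category pages plus random product pages.
sample = [
    "http://example.com/category/widgets/",
    "http://example.com/category/gadgets/",
    "http://example.com/product/widget-123/",
    "http://example.com/product/gadget-456/",
]

for url in sample:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    words = len(soup.get_text(" ", strip=True).split())
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    # 400+ links and a low word count on every sampled page points to a
    # template-level problem shared across the whole section.
    print(f"{url}: {len(links)} links, ~{words} words, title: {title[:60]}")
```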

Comprehensive Sampling

Don’t get me wrong. I don’t just look at a handful of things. My strategic audits really are comprehensive in that I look at every aspect of a site’s SEO footprint. It’s just that, at the strategic level, I’ve found that reviewing more than a sampling of each factor doesn’t yield enough additional insight to justify the time spent and the cost to clients. The results that come when clients implement one of my audit action plans have proven, over the last several years, to be quite significant.

Findings and Recommendations

For this particular site, here’s what I found and what I ended up recommending.

The site’s Information Architecture (IA) was a mess. It had an average of more than 400 links on every page, content was severely thin across most of the site, and the User Experience (UX) was the definition of confused.

[Screenshot: the pre-audit home page]

Top Half of the Pre-Audit Home Page

Note that the screen shot above just displays the TOP HALF of the pre-audit homepage. Put yourself in a user’s shoes. Sure, some users might find a homepage with this design gives them a lot of great choices for finding information. In reality, all this did was confuse most users. They spent a lot of time trying to digest all the options, and the overwhelming majority of users never actually clicked on most of those links.

That’s why click-through data is so vital at this stage of an audit. Back when I did this, Google Analytics didn’t yet have In-Page Analytics to show click-through rates, and I didn’t want to take the time to get something like Crazy Egg in place for heat maps. Heck, I didn’t NEED heat maps. I had my ability to put myself in the common user’s shoes. That’s all I needed to know this site was a mess.

And the problems on the homepage were repeated on the main category pages, as well.

[Screenshot: the pre-audit category page]

Pre-Panda Recovery Category Page


Topical Confusion of Epic Proportions

Note the little box with the red outline in the above screen capture. That’s the main content area for the category page, and all those links around it are just massive topical confusion, dilution, and noise. And, just like the homepage, you’re only looking at the TOP HALF of that category page.

Duplicate content was ridiculous. At the time of the audit, the site had more than 16,000 pages, plus thousands of “ghost” pages generated by very poor Information Architecture, yet Google was indexing only about 13,000. Article tags were automatically generating countless “node” pages, and that duplication, along with all the other issues, completely confused Google’s ability to “let us figure it out.”
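
A quick way to surface that kind of duplication across a crawl list is to hash the main content of each page after stripping the site-wide chrome. This is only a sketch: the URL file and the list of tags to strip are both assumptions about what counts as “chrome” on a given site:

```python
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

seen = defaultdict(list)
with open("crawled_urls.txt") as f:
    for url in f:
        url = url.strip()
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        # Strip the site-wide chrome so only the main content gets hashed.
        for tag in soup(["nav", "header", "footer", "script", "style"]):
            tag.decompose()
        text = soup.get_text(" ", strip=True)
        seen[hashlib.md5(text.encode("utf-8")).hexdigest()].append(url)

# Any digest shared by multiple URLs is a duplicate-content cluster,
# exactly the pattern auto-generated "node" pages produce.
for digest, urls in seen.items():
    if len(urls) > 1:
        print(f"{len(urls)} URLs share identical main content, e.g. {urls[:3]}")
```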

Site Speed Issues

One of the standard checks I do when conducting an audit is to run a sampling of page URLs through Pingdom and URIValet to check the speed factor. Nowadays I also cross-reference that data with the page-level speed data provided right in Google Analytics, an invaluable three-way confirmation check. In the case of this site, there were some very serious speed problems. We’re talking about some pages taking 30 seconds or more to process.

Whenever there are serious site speed issues, I look at Pingdom’s Page Analysis tab. It shows the process time breakdown, times by content type, times by domain (critical in today’s cloud-driven content delivery world), size by content type, and size by domain. Cross-reference that with URIValet’s content element info, and you can quickly determine the most likely cause of your biggest problems.
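
For a crude first pass before reaching for Pingdom or URIValet, you can at least time the raw HTML response for a sample of URLs. This only captures server processing and document transfer, not ad scripts or images, and the five-second threshold is an arbitrary assumption:

```python
import time
import requests

with open("sample_urls.txt") as f:
    for url in f:
        url = url.strip()
        start = time.time()
        r = requests.get(url, timeout=60)
        elapsed = time.time() - start
        # Flag anything slow enough to warrant the deeper Page Analysis look.
        flag = "  <-- SLOW" if elapsed > 5 else ""
        print(f"{elapsed:6.2f}s  {len(r.content):>8} bytes  {url}{flag}")
```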

When Advertising Costs You More Than You Earn from It

In the case of this site, they were running ads from four different ad networks. Banner ads, text link ads, and in-text link ads were on every page of the site, often with multiple instances of each. And the main problem? Yeah, there was a conflict at the server level processing all those different ad block scripts. It was killing the server.

Topical Grouping Was a Mess

Another major problem I found while performing the audit was that, in addition to a lot of thin content and a lot of “perceived” thin content (due to site-wide and sectional common elements overcrowding the main page content), the content was poorly organized. Links in the main navigation bar went to content that was simply not relevant to the primary intent of the site. Within each main section, too many links pointed to other sections, and there was no way search engines could truly validate that “this section is really about this specific topical umbrella.”

Topical Depth Was Non-Existent

Even though the site had tens of thousands of pages of content, keyword phrases had been only half-heartedly determined, assigned, and referenced. Combine that with the over-saturation of links to only slightly related content, a lack of depth in the pure text content on category and sub-category pages, and the sectional and site-wide repetition of headers, navs, sidebars, callout boxes, and footer content, and the end result was that search engines couldn’t grasp that there really was a lot of depth to each of the site’s primary topics and sub-topics.

Inbound Link Profile

A review of the inbound link profile showed a fairly healthy mix of existing inbound links; however, the volume was weak, particularly in the diversity of anchor text variations, especially when comparing the site’s profile to those of top competitors.
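
Here’s a minimal sketch of that anchor-diversity check, assuming a backlink export from any link index with an “anchor” column; the brand, keyword, and generic term lists are hypothetical stand-ins for a real site’s own terms:

```python
import pandas as pd

links = pd.read_csv("backlinks.csv")

brand_terms = {"examplebrand"}                      # the site's brand name(s)
keyword_terms = {"blue widgets", "cheap widgets"}   # its money keywords
generic_terms = {"click here", "here", "this site", "website", "read more"}

def classify(anchor: str) -> str:
    a = str(anchor).lower().strip()
    if any(t in a for t in brand_terms):
        return "brand"
    if a in generic_terms or a.startswith("http"):
        return "generic/naked"
    if any(t in a for t in keyword_terms):
        return "keyword"
    return "other"

# A healthy profile shows real spread across the categories; a thin spread
# relative to competitors is the weakness the audit called out.
print(links["anchor"].map(classify).value_counts(normalize=True).round(3))
```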

Putting Findings Into Action

Another critical aspect of my audit work is taking those findings and putting them together in a prioritized action plan document. Central to this is knowing what to include and what to leave out. For a site with this much chaos, I knew there was going to be six months of work just to tackle the biggest priorities.

Because of this awareness, the first thing I did was focus my recommendations on site-wide fixes that could be performed at the template level. After all, most sites only have one, two, or a few primary templates that drive the content display.

From there, I leave out what I consider to be “nice to have” tasking, things a site owner or the site team COULD be doing but that would only be a distraction at this point. Those are the things that can be addressed AFTER all the big picture work gets done.

Tasking for This Site

After the rest of my audit review for this site, I came up with the following breakdown of tasking:

Improved UX

  • Eliminated off-intent links from main nav
  • Reordered main nav so most important content nav is to the left
  • Reduced links on home page from 450 down to 213
  • Rearranged layout on home page and main category pages for better UX

Addressed server speed and script noise issues related to ad networks

  • Removed in-text link ads
  • Reduced ad providers from four networks down to two
  • Reduced ad blocks from seven down to five on main pages
  • Reduced ad blocks on sub-cat and detail pages from nine down to seven
  • Eliminated ad blocks from main content area

Refined keyword topical focus across top 100 pages

  • Rewrote page titles
  • Modified URLs and applied 301s to the old versions (see the verification sketch after this list)
  • Rewrote content across all 100 pages to reflect refined focus
  • Reorganized content by category type
  • Moved 30% of sub-category pages into more refined categories
  • Eliminated “node” page indexing
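
As promised above, here’s a sketch of how the 301 work can be verified after rollout, assuming a hypothetical two-column CSV mapping each old URL to its new one. Every old URL should answer with a single 301 hop straight to its replacement:

```python
import csv
import requests

with open("redirect_map.csv") as f:
    for old_url, new_url in csv.reader(f):
        # Don't follow redirects: we want to see the first hop itself.
        r = requests.get(old_url, allow_redirects=False, timeout=30)
        location = r.headers.get("Location")
        ok = r.status_code == 301 and location == new_url
        print(f"{'OK ' if ok else 'BAD'} {old_url} -> {r.status_code} {location}")
```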

Inbound Link Efforts

  • Called for establishing a much larger base of inbound links
  • Called for greater diversity of link anchors (brand, keyword, and generic)
  • Called for more links pointing to deeper content

________________________________________________________________

Here’s a screen capture of the top half of the NEW, IMPROVED home page.

[Screenshot: the new, improved home page]

Post-Panda Home Page (Top Half)

Yeah, I know. You probably have an opinion on colors, aesthetics, and whatnot. I do, too. However, the issue wasn’t about certain aesthetics. That is exactly the kind of issue that can be dealt with AFTER all the priority work happens.

Massive Changes Take Time

Given how much work had to be done, I knew, as is usually the case with action plans I present to audit clients, that they were in for a long overhaul process that would last several months. I knew they’d start seeing improvements as long as they followed the plan and focused on quality fixes, rather than slap-happy, just-get-it-done methods.

Results Are Epic

The results that have occurred over the course of the past year have been truly stunning to say the least. I knew the site would see big improvements in organic data. I just had no idea it would be on this scale.

[Chart: organic traffic plotted across the full pre- and post-audit period]

Here are three screen captures of the actual Google Data, just so you can see I’m not making all of this up.


[Screenshot: Google Analytics, the pre-Panda period]

The Panda Effect

[Screenshot: Google Analytics, the post-Panda period]

Post Panda – Audit Work Starts Rolling Out

[Screenshot: Google Analytics, one year later]

What A Difference Sustainable SEO Makes

Takeaways

As you might venture to guess, the recovery and subsequent, continued growth has been about as “best case scenario” as you can get. It truly is one of my proudest moments in the industry. The fact that, at the time I performed the audit, not one single other site had been reported as having recovered means more than words can describe.

Sustainable SEO Really Does Matter

The biggest takeaway I can offer from all of this is that sustainable SEO really does matter. Not only has the site rebounded and gone on to triple its organic count, it has sailed through every other major Google update since. What more can you ask for in correlation between the concept of sustainable SEO best practices and end results?

UX – It’s What’s for Breakfast

I drill the concept of UX as an SEO factor into every brain that will listen. I do so consistently, as do many others in the search industry. Hopefully, this case study will allow you to consider more seriously just how helpful understanding UX can be to the audit evaluation process and the resulting gains you can make with a healthier UX.

IA – An SEO Hinge-Pin

With all the IA changes made, Google now indexes close to 95% of the site content, more than 17,000 pages versus the 13,000 previously indexed. Their automated algorithms are much better able to figure out topical relationships, topical intent, related meanings, and more.

Ads as Shiny Objects

This was a classic case study for many reasons, one of which is the reality that a site owner can become blinded to the advertising revenue shiny object. Yet there comes a point where you cross over the line into “too much.” Too many ads, too many ad networks, too many ad blocks, too many ad scripts confusing the server…you get the idea. By making the changes we made, overall site revenue from ads went up proportionally, as well, so don’t let the concept that “more is always better” fool you.

Not Provided Considerations – Long Tail Magic

This observation applies to just about any site, but it’s more of a comfort for medium and large-scale sites with thousands to millions of pages.

If you look at the first spreadsheet I included here, you’ll note that the total number of keywords used to discover and click through to the site appears to spike and then level off. Look more closely at that data, though, and you’ll also observe that the plateau is directly related to both “Not Provided” and “Other,” with “Other” actually accounting for more keyword data in one bucket than “Not Provided” (approximately 13% vs. 8%).
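
The bucket math itself is easy to reproduce from a keyword report export. A sketch, assuming “keyword” and “visits” columns; “(not provided)” and “(other)” are the literal row labels Google Analytics uses for those buckets:

```python
import pandas as pd

kw = pd.read_csv("keyword_report.csv")
total = kw["visits"].sum()

# Share of organic visits sitting in each opaque bucket.
for bucket in ["(not provided)", "(other)"]:
    share = kw.loc[kw["keyword"] == bucket, "visits"].sum() / total
    print(f"{bucket}: {share:.1%} of organic visits")

# Everything outside those buckets is the visible long tail.
visible = kw[~kw["keyword"].isin(["(not provided)", "(other)"])]
print(f"distinct visible phrases: {len(visible):,}")
```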

The more critical concept here is that even though you may in fact be missing opportunities due to those two buckets, you should look at the other aspect of the keyword count.

[Screenshot: post-Panda long-tail keyword growth]

First, by cleaning up and refining the topical focus on the 100 most important pages and instilling proper SEO for Content Writing into the client mindset, we’ve taken the long tail to a whole new level, a level that people may have difficulty comprehending when I talk about REDUCING the number of keywords on a page.

Yet it’s that reduction and refinement, coupled with truly high-quality content writing that is GEARED TOWARD THE SITE VISITOR, that finally allows search engines to do a really good job of figuring out that all the variations for each page and each section of the site really do match up from a user intent perspective.

If you were to unleash the “Other” and “Not Provided” phrases into the total count, you’d see that the site now generates visits via more than a million different phrases over the past six-month period. A MILLION PHRASES. For all the potential opportunities in “Not Provided,” how much time would YOU be able to spend analyzing over a million phrase variations looking for a few golden nuggets?

Personally, my client and I are quite content with not being able to dig into that specific metric in this case. So while it’s a bigger challenge for much smaller sites, when you’re at this level, I’m more concerned that Google says, “You have so much data we’re just going to group hundreds of thousands of phrases into ‘Other’.”

In Conclusion

Well, there you have it, my very first Panda Recovery Audit a year later. And for the record, the findings and recommendations I detailed in this case study were presented at SMX Advanced 2011. At the time, some of the feedback I got was “That’s nice, but it’s all speculation. Nobody’s told us this is for sure how to fix Panda.” #WIN


Alan Bleiweiss is a Forensic SEO audit consultant whose audit client sites consist of upwards of 50 million pages and tens of millions of visitors a month. A noted industry speaker, author, and blogger, his posts are quite often as controversial as they are thought-provoking.



57 thoughts on “The Holy Grail of Panda Recovery – A 1-Year Case Study”

  1. After reading this article I’ve come to the conclusion that this Ryan Jones guy is a genius.

    Seriously though, awesome post. It’s always fun to see people’s thought processes and how they approach problems. Thanks for sharing, was definitely a great read.

  2. Really great case study Alan.

    This is a great example for all SEOs that the whole process is important and that there are no special factors, just a lot of best practices and common sense.

  3. Alan, as I savored each word of your case study, I realized this is the type of process and documentation that is MISSING in today’s colleges.

    SEOs and webmasters can gain more than traffic: improved usability and a better understanding of topical focus, just by reading the principles you’ve outlined. You DESERVE to be proud of the outcome!

    Nice hat tips for referencing Dr. Pete’s Google Update Timeline and to Edward Lewis’ URI Valet. Both are invaluable resources!

    Kudos, my friend!

  4. Spot-on great case study here, Alan! I too look at UX – it is an item that too many web designers never get right, IMHO! Will post the URL to this to some of our own clients too!

    :-)

    Jim

  5. Fantastic. I will be adding this to my educational tools for the newbs among us. This is how you execute good SEO.

    Thank you for sharing your knowledge!

  6. Looking at the screenshot of how the site was first designed, you could see the problems. Good work on getting it fixed and looking better.

  7. Alan,

    Loved taking this ride through your approach, process and the results after implementation. Thanks for sharing this excellent example of “sustainable SEO.”

    1. The UX guides people to find things. If the UX is mud, finding things is more problematic. If people can’t find things, they aren’t happy.

      UX is also inherent to the bigger Information Architecture that drives that UX. IA as a whole (including UX) dictates content relationship perception, both to humans and bots.

      That’s a very over-simplified explanation, however it’s the crux of the concept.

    2. Also, while the robots follow links, they also provide their content to the indexing systems and databases. If most of that “content” isn’t text but random topic links, that definitely doesn’t help.

  8. It’s great to read this, as we’ve been including UX in our audits too these past few years. Because we focus on conversion (depending on what the goals are) more than just SEO in isolation, it made more sense to me, and I’m really glad to see it more prevalent now. My outset is always: “Let’s see how we can get the existing traffic converting better” and it naturally improves SEO… Well done Alan!

  9. Great writeup and a job well done.

    I am amazed at how many sites are out there with bloated pages doing nothing but causing problems. It is interesting to me how the crawl rate went up so much before the hammer fell on the site.

    Maybe Mr. Cutts put in a factor that if the Googlebot gets lost in a site, it gets a penalty just for being bloated.

  10. Great piece of insight on Forensic SEO audits. Definitely laid out a clear process for all of us to follow when dealing with E-Commerce clients.

    One thing I did though was to look at two of my clients’ crawl rates pre-Panda. Contrary to what you mentioned, the crawl rate actually decreased before and during the same period. I’m still trying to figure out a plausible explanation for this. Perhaps you can shed some light?

    1. Karl,

      I honestly can’t explain why you had a different result. One thought (a wild guess) is that some sites got an increase in crawl rate because they had been flagged by Google through a first-round matching process: “Hey – what are the key indicators we can use to see if a site might need to be pandalized? Okay, these sites over here meet those criteria. Let’s ramp up crawl rate on a massive scale to see if we’re right, and if so, let’s maul those suckers.”

      From there, IF that is what happened, it could literally be a case where they said “okay good – our deeper look proved we’re spot-on with this update – let’s turn it loose on the whole index…”

  11. Thanks, your article was very useful to me. I have a question: was the high number of internal links on the home page the main problem, or only a secondary factor?

    1. Salvino

      I’m not 100% sure if it was a primary factor or secondary. I tend to think it’s secondary, while still being key to overall high quality SEO. Google says “put a link to it on your home page if you want us to know it’s important”. If you have too many links, that could devalue what really is more important.

    2. The other consideration is UX. If there is too much going on, it confuses topical focus and delays finding the things that are important. That then reflects directly on the devaluation problem I just suggested.

  12. It was pointed out that in my initial write-up of this case study I failed to mention inbound link aspects. I have since added in what I found related to the inbound link profile during my audit and the subsequent recommendations called for related to links. As noted in the original write-up, the client has implemented all of my recommendations. That holds true to the inbound link work now included in the case study…

  13. Hey Allan,

    I am glad to read this case study. Thanks to Tedster on WebmasterWorld for referring this. But I do have a few queries which I guess would help many. I have posted them on WebmasterWorld. Can you please answer them and help? I am posting the questions here as well.

    “Rewrote page titles
    Modified URLs and applied 301s to old version”.

    Any idea on why these two were done? What necessitated the title changes, and were the title changes done for many pages? What percentage of the total number of pages on the site did they cover? Was this done en masse? What kind of changes are they?

    The next one is even more interesting. Any reason to modify URLs and apply 301s? Was this also done to a number of pages? What percentage of the total number of pages on the site did it cover?

    “Moved 30% of sub-category pages into more refined categories?”

    I am not understanding why this was done. Was it because the sub-category and the main category weren’t relevant to each other, or were only loosely coupled?

    “Eliminated “node” page indexing”

    What do you mean by a “node” page? Why were such pages de-indexed? How was this done? i.e. Was it through “noindex, nofollow” or “noindex, follow” or through some other method?

    Lastly, you aren’t talking about the use of user metrics like time on page, bounce rate, exit rate, etc. in the whole process! Did you use them or not?

    Thanks a ton for this write up.

  14. Rajesh,

    In the case study I explained how the topical focus was not very refined – there was too much dilution of topics on the site at the highest levels. Addressing this requires rewriting the page Titles, reinforcing that with a more accurate URL structure (and thus the need to then do the redirects), and reorganizing and reworking content – all on the top 100 or so pages of the site. It’s a 17,000-page site, however those are the most important pages.

    All of that work dramatically improved the ability search engines have to determine proper topical intent on those critical site pages.

    As for the “node” indexing, the site was built on a content management system that by default, creates pages called nodes. They’re duplicate content problems on a massive scale. I don’t recall the method used to eliminate the problem (it’s been over a year now).

    Time on page, bounce rate, exit rate are part of the many things I review, however as I stated in the case study, a critical part of my work is to only include in the actual action plan those things I found that were both a problem and high priority. If things I looked at were, in my opinion, either not a problem, or alternately, a low priority problem, I left them out of the audit. As I do with all my audit work.

  15. The graph mentions that the recovery started as soon as they started implementing the changes. Is that really true? I was hit by Panda on Apr 19th, 2012 and Penguin on Apr 24, 2012. After stripping my site down to basics for user experience, I’m still waiting for recovery.

    1. Every site is unique. There are too many considerations to know why one site bounces back faster than another without digging deeper into the SEO across the board. That’s just the reality and nature of our world.

  16. One conclusion I came away with from reading other articles on Panda X and other resources is that there is no SINGLE factor that can get a site hit or increase its rank. It’s a combination of factors of different weights and magnitudes, which makes it hard to always pinpoint a single factor. Fixing the issues on a large site the way it was done here is awesome work. Congrats!

    1. You are absolutely correct Malcom. In fact, I am sure some of the changes I recommended were NOT specific to Panda, or MIGHT have been partially related. The end result though is by taking all of the recommendations and implementing them, we not only resolved Panda, we stabilized and strengthened the site.

  17. Interesting article Alan…

    My question is, how confident were you that they would implement all of the suggested changes correctly and do you think the results would’ve happened faster if you did them yourself?

    1. OMG I am floored and honored that the SEO Rapper commented on one of my articles. That alone is epic! Okay then… I do my best to only work with clients who respect my opinion, yet I understand that many times budgetary constraints, resource limitations or corporate political landscapes prevent at least some of my recommendations from being implemented.

      Having said that, I was fairly confident in this case. Most clients who come to me after they’ve been hammered so thoroughly are usually pretty much willing. In this case, the client was beyond eager, enthusiastic and excited to get the work done.

      I was the one who performed the tactical audit that generated the revised content organization and new page Titles – so they got done properly the first time, based on my turnaround time. Other changes I left to the client or their team, or in this case, a 3rd party content resource (Vertical Measures), and there’s no way I would have gotten results faster on any of that work given my own work load.

  18. Good Article.
    But you had a website that was pretty bad and did everything wrong. If you have a site that already does some SEO, has a good UX and so on, and still runs into Panda, that’s the real challenge!

    Anyway, very interesting Post

  19. Looking back on the series of updates, someone in our office noted that it seems like we’re fighting an uphill battle here. We tracked organic traffic to one of our clients’ sites, and it seems every time visits start to increase, another update comes. Uphill battle.

    1. Some people call that uphill battle “job security”. Seriously though, this is why SEO is a true professional services offering. It’s not so easy for a part-time effort when you get into competitive areas.

  20. I have read this carefully and I understand that you did not focus on the bounce rate, but I would appreciate it if you could give some feedback on my questions

    Q1
    Does Bounce Rate mean the user has arrived on a page on this site and left directly from that page,

    or does it mean the user has exited the site from the same page they arrived on?

    Q2
    I understand there were a lot of issues here, but why would a bounce rate of 58 to 60% not make your priority list?

    Q3
    Do you have any tips to give that could reduce the bounce rate ?

    Thanks
    David

    1. Bounce rate is the percentage of visitors who come to the site then leave without having visited any other pages.

      Bounce rate itself is NOT relevant to SEO because a site might have exactly what most people want, all on one page. This site, for example, has a medical calculator that many people come for. When they get directly to that calculator, it takes about a minute on average to use it and get the info desired. No need to continue to other pages.

      So looking at this site, the average of two and a half pages per visit tells me enough people find the site worth sticking around. And the bounce rate is reasonable given that enough people find what they want on one page.

      So as long as you take the time to understand what you want people doing on your site, you can then evaluate whether bounce rate is more important than it might be on another site. User intent matched against content relevance is the key issue.

      Tips would need to be custom tailored to every site or even every unique aspect of every site. It requires understanding user mind models – what do my users want, how do they think and process information? What do they care about when they come to my site? Am I providing maximum fulfillment of their desired experience?

  21. Thanks for your answers. I created a report in Google Analytics for bounce rate and it shows visitors visited more than one page. So I decided to look it up on Wikipedia, and they say what you have answered, yet I found on Google an addition to the Wikipedia answer which states that entering and leaving from the same page is also included in bounce rate. Which for me is very stupid, as it does not reflect the real theory of bounce rate (as you have stated in your answer). So I have decided to forget about bounce rate completely.

    I went back to your article and I would like to thank you for the explanation of the speed tests. This is really cool. I could see problems on my site using Google speed tests, and by using your instructions I was able to determine that image resizing using TimThumb is what makes up most of the load time, even after I switched it off using an option inside my WordPress template. I am now looking at a lot of work to resize all my images and potentially redesign the whole site, but hey, if it means better UX and people don’t get bored waiting for the home page to load, then I am all for that.

    Thanks for a great article

    1. Hi David

      When the debate over whether Google used Google Analytics data to influence search result rankings came to a head during SMX Advanced this year, Matt Cutts stated emphatically that they do not use such data – and later he confirmed to me that means both the web spam team and the quality team. In those discussions within the community, bounce rate was one of the issues. The collective thinking is that the only aspect of bounce rate that matters regarding SEO specifically is pogosticking – doing a search, clicking to a site from the search engine, then coming back and doing another search or clicking to a different result link.

      Ultimately, bounce rate has to be taken together with user intent, and it’s nearly impossible for search engines to even attempt to understand user intent / results quality specific to bounce rate in an automated way. So yeah, I only look at it from that perspective, and it’s only a minor signal unless it’s obviously bad.

      Sorry to hear you will have so much work in front of you. Yet the end result can only be a better experience all around so yeah, it’s worth considering.

  22. Hey Alan,

    I’ve found that ads and how they’re displayed can single-handedly affect rankings. Sites I’ve placed ads on drop in rankings within weeks, and when I remove or reorganize those ads, the rankings move back. How much of an effect do you think ad placement and display had on this site’s rankings?

  23. One of my sites was also hit badly by a Panda update, but frequent posting and guest blogging repaired it, and it is as good as before. Also, Google seems to consider almost all ads except AdSense as spam.

  24. Hi Alan, many thanks for sharing your case study with us. I agree with all that you’ve said and also learned a thing or two during my read. It’s funny, no matter how much you think you know, there’s always more to learn.

    I have a large site that was hit on January 20th, it’s difficult to tell whether it was Panda or ATF. Especially as we have a max of 1 ad per page and yet the drop corresponds more closely to ATF. I’ve battled long and hard with it. But I’m going to look at it again tomorrow, with fresh eyes, based on your case study.

    Thanks again,

    Paul

  25. Hi Alan

    I’ve just taken over management of a bunch of flooring websites, all hit with the same above-mentioned issues. Your post has helped put my own findings in perspective, given me more detail to think about, and also added some order.

    FYI, I picked up this article as a recommendation from Tedster on WebmasterWorld – he was right, it was definitely worth a read!

    Bob

  26. One of the best posts I’ve read in a very long time, great job Alan. UX is something that we used to focus more on, but this just proves that it’s something to always be considering.

  27. Alan,

    you deserve a massive thumbs up for this post. One thing that is clear to me is that it doesn’t matter what algorithm change Google releases; the most important thing is that search marketers, SEOs et al focus on the following, as you’ve clearly explained.

    Build a website with the user in mind not the search engines (UX)

    Get your site architecture right to allow smooth, well laid out navigation

    Write unique content worth reading

    Ensure that your link profile is natural with a good mix of anchor text

    Get these things right and you will be a winner!

  28. Just wanted to thank you for taking the time to share this information. Of particular note was your advice on information architecture and improved UX. Keep up the good work, it’s much appreciated.

  29. I think the key takeaway here is to take away. Almost all the on-page strategy you used to recover revolves around removing extraneous junk, decluttering, making choices as to what’s really important. For so long, throwing as much crappy content and spammy linkbuilding at the wall as possible seemed like a super idea. Not to say that everyone was doing it to manipulate rankings. I think a lot of the time it happens because people are natural hoarders and they have a hard time letting go. They think everything is priority #1 and can’t be cut from the home page (for example). Sometimes it takes an outside source (like your friendly neighborhood SEO or UX designer) to mercilessly take the snippers to stuff that’s hurting more than it’s helping. Good read. Now that said, I need to take this article and a fresh set of eyes to a lil site I just got handed and see if I can bring it back from the dead. :) Thanks for the good read, Alan. Much appreciated.

    1. Heather,

      Thank you for bringing up the notion of “too much” and the need to “declutter” as a key concept. It’s quite often so true. Sometimes it’s also that, combined with a lack of depth in content specific to a page’s core focus. Yet that effort can be made so much less challenging when the page is cleaned of the junk and the clutter…

  30. Hi Alan,

    an excellent and well articulated case study, many thanks I learned much.

    I was interested to see on page advertising can really be a case of less is more.

    A great read

    Regards

    Steve

  31. Nice study on the Google Panda update. One of the websites that I own was hit by the Panda update, it seems. Don’t know, probably because of sharing too many YouTube videos.