SEO

Are You Making These 7 Panda-Punishing Content Mistakes?

The Google Panda update has fundamentally changed the SEO landscape. In a great short video, Rand explains how it changed SEO best practices forever. SEOs are now responsible for:

  • Design and user experience
  • Content quality
  • User and usage metrics

In a way, your job has been upgraded to web strategist. In the past, you only had to worry about generating content. Now you have to worry about generating high-quality content.

Fortunately, Google has shared plenty of clues on how to do that, like encouraging us to think like engineers who ask these 13 questions when generating SEO-friendly posts.

That’s great for the future, but what about the content that exists on your site now…the content that you created based on the old algorithm rules? If you don’t fix it, Panda could hurt you.

Here’s what you need to look for and how to fix it.

Internal True Duplicates

When it comes to duplicate content, this is the most basic. Internal true duplicate content is what happens when you create unique URLs for the same piece of content.

For example, let’s say I want to promote “How Social Media Affects SEO.” Well, there is the original URL. Then I create a special promotion off of my root domain, quicksprout.com. Let’s call it quicksprout.com/promotion.

Well, off that parent I build a URL like this: quicksprout.com/promotion/how-social-media-affects-SEO.

As you can see, I now have two URLs pointing to the same content on my site. These are internal true duplicates…and when Google finds them, it will penalize you if it thinks you are trying to game the algorithm.

That’s important to realize: you may not be trying to game anything. You simply want to promote. Google, unfortunately, can’t know your intentions. Yet.

The more URLs you build that point to the same content, by the way, the more it looks like you are trying to game Google and the more likely you are to be penalized site-wide for the practice.

Takeaway: To take care of this problem, you simply have to eliminate the duplicates. Fresh, unique, original content across your site is the only thing Google wants to see.
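One common way to retire a duplicate URL like the promotional one above, without breaking any links that already point to it, is a 301 redirect back to the original. Here’s a minimal sketch, assuming an Apache server; the destination path is only an illustration, so point it at wherever the original article actually lives:

  # .htaccess – permanently point the promotional duplicate at the original article
  Redirect 301 /promotion/how-social-media-affects-SEO http://quicksprout.com/how-social-media-affects-SEO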

External True Duplicates

If you syndicate your content across domains, whether they are yours or somebody else’s, your site could be penalized: when Google sees the same content across multiple sites, it looks like SERP noise.

For example, say that every article I write on Quick Sprout gets distributed to syndicators, including “How Social Media Affects SEO.” If you then search for it and the SERPs throw up multiple URLs for the same article…that’s noise Google doesn’t like and will likely penalize.

Takeaway: The way to fix this problem, if you own all the domains, is to create a cross-domain canonical tag. Basically, you need to choose which URL is going to be the source, and point the canonical tag on every copy to it.
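As a rough sketch, assuming the Quick Sprout version is the source, each syndicated copy would carry a tag like this in its head section (the URL here is just the example from above):

  <!-- in the <head> of every syndicated copy of the article -->
  <link rel="canonical" href="http://quicksprout.com/how-social-media-affects-SEO" />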

If you don’t own all the content properties, then you need to work with the webmasters to fix the problem…a time-consuming task. By the way, do I recommend syndication? No, for the reasons I mention above.

Special note: Sometimes scrapers will cause duplicate content problems for you across the web. Look at this SERP for one of my articles:

[Screenshot: SERP showing multiple scraped copies of the article, with the original Quick Sprout post on top]

That’s a lot of scraping going on. Fortunately, the original article is on top. Why? I’ve got more site authority than the scraper sites.

So, the moral of the story: if you are getting beaten by scraper sites, you need to build your site authority. If that doesn’t help, then issue a DMCA takedown.

Internal Near-Duplicates

Sites that are trying to rank high for locations or themes often get penalized here, because what they typically do for content is generate one article on a topic and then change it slightly based on a few strategic keywords and headers.

You might see this with national companies trying to rank high for individual city SERPs, like “best Seattle insurance,” or for related themes, like “child head injury trauma” and “child neck injury trauma.” The introduction copy is changed to reflect the keywords, but the bulk of the remaining content covers a guardian’s rights…and that content is the same across all the pages.

Notice how this plays out in this search for “child injury trampoline lawyers”:

[Screenshot: SERP for “child injury trampoline lawyers”]

Here’s the content when you click through:

[Screenshot: the landing page for that result]

Notice the only mention of “trampoline” is that one bullet point? When Google crawls this content, it sees that only a few lines of copy have changed and immediately thinks low quality. That’s probably one reason it sits at number four; I wouldn’t be surprised if it sat even lower in the rankings.

This can also happen to large ecommerce sites when a product has multiple variations, like size and color. The copy stays the same; just a few words change based on these variations.

Takeaway: The only truly suitable way to fix geo-location or theme-based duplicate content is to rewrite the content so it is fresh and original. That’s time-consuming and expensive, but you have to ask yourself…isn’t avoiding a penalty worth it?

The same holds true for the product pages of an ecommerce site, but it’s not always feasible to write fresh, original content for each variation of a product…so create one well-written version and then keep the dynamically generated variation pages out of the index with meta noindex tags.
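As a minimal sketch, the tag sits in the head section of each variation page; the “follow” value is my assumption here, so the links on the page still get crawled even though the page itself stays out of the index:

  <!-- in the <head> of each size/color variation page -->
  <meta name="robots" content="noindex, follow">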

External Near-Duplicates

Affiliates and partners may sometimes grab content and product pages from your site. This creates near-duplicate content across the web, which Google treats much the same way it treats true duplicates.

Takeaway: To beat out the borrowed content, you have to beef up your pages with unique content, which can usually be done effectively and without much cost or time through user-generated content. Just two or three pieces of UGC and a unique intro to your copy will do the trick. You can focus on your top 10% best-performing products and test the results.

Search Within Search

Large sites, especially ecommerce sites, tend to store pages and pages of internal search results. When a user searches and one of these search pages shows up in the SERPs, Google frowns upon it.

The reason is that Google doesn’t want to deliver another search page to a user; it wants to deliver a relevant, useful page of information. So if it spots these pages, it will mark them as spam and penalize you.

Takeaway: Obviously this doesn’t impact small or most medium-sized sites, so it may not be of interest to you unless you are responsible for a large site. To control the problem, you must block the pages from being crawled or noindex them. It’s time-consuming, but it must be done.
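For the blocking route, a robots.txt rule is a rough sketch of what that looks like; the /search/ path is just an assumption on my part, so swap in whatever URL pattern your internal search results actually use:

  # robots.txt – keep crawlers out of internal search result pages
  User-agent: *
  Disallow: /search/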

High-Structure Site Design, Low Copy Density

It seems that Google will also penalize sites that put a heavy emphasis on a design template and neglect the copy.

For example, here is a page from WebMD.

[Screenshot: a WebMD page where the design template dominates the actual copy]

The problem with this is that it creates a page dominated by repeating elements like mega-footers, excessive navigation, dynamic content, or repeated images. The content quality could be high, but these signals indicate low quality.

Takeaway: Write some original, detailed content that signals high quality, and tone down the design template. I’ve learned a lot from my own experience with site design and from testing it with you, and I’ve discovered that simplicity is king. Determine what your site design can live without and eliminate it. Test all of this, of course.

High Ad Ratio

This one and the last one are similar in that they are design-related, but this one deserves its own section because it is a huge issue.

Webmasters want to make money off their sites with the passive income that comes from ads…so they unfortunately think, “If one ad makes me $100 a month, then 10 should make me $1,000.” The problem is that a high ad ratio is one of the signals Panda’s quality-control panel flagged as noise…and gave a thumbs down. That’s true even if you are producing high-quality content. You may not think it’s fair, but users prefer fewer ads…which means Google does too.

Takeaway: Limit the ads on your site to the two or three highest performers. In fact, if you are tracking your ads, you already know that not all of them perform the same. Pull the ones that aren’t working.

Conclusion

When it comes down to it, you should also seriously consider removing your lowest-performing pages. These are the pages with few if any incoming links, a low visit rate, or a high bounce rate. You can rehab the pages if you want, but I’ve found it easier to just scrap them and start again. Here’s how to identify low-performing pages.

And regarding duplicate content, you can use a couple of tools to help you identify it: Copyscape or Duplicate Content.

How has your site and content been impacted by Google Panda? Do you have duplicate content issues? And how did you handle them?

 

Neil Patel is the co-founder of KISSmetrics, an analytics provider that helps companies make better business decisions. Neil also blogs about marketing and entrepreneurship at Quick Sprout.



25 thoughts on “Are You Making These 7 Panda-Punishing Content Mistakes?”

  1. A website with a lot of ads just looks spammy. A business website shouldn’t have them. It’s just a distraction. What the Google Panda update did was remind website owners what was important, and that is focusing on the needs of human visitors, not the search engine spiders.

  2. Neil:

    While I agree with 99% of the above, I have to chime in on syndication. Prolific content marketers who get syndicated frequently by other sites and webmasters have a decision to make if they can’t get those sites to deploy a cross-domain canonical tag: either disallow the syndication of the content or live with it.

    Here’s the issue:

    For prolific content marketers, organic traffic tends to outpace syndicated referral traffic, but syndicated referral traffic converts at a much higher rate. In my case, organic traffic has a 1.43% conversion rate and referring sites from syndication boast a conversion rate of 6.14% (26 months of data). Because of this, the vast majority of my conversions come from syndication.

    If I had to make the business decision between search and syndicated referral traffic, the choice is easy. I would much rather take the chance of getting punished a little by Google than lose syndication.

    @CPollittIU

  3. I came to know about duplicates in my blog, which runs on WordPress and had internal duplicates from URL parameters, with Google showing indexed pages as high as 40,000+ and a huge supplemental index.
    I’ve brought them down to 1,200+ over 2 months by setting up 301 redirects and manually submitting 3,000+ URL removal requests through Google Webmaster Tools.

    The current status is 600+ in the primary index and 600+ in the supplemental index.

    What do you say, should I file a reconsideration request now that the internal duplicates have been addressed? Your suggestion on this would be appreciated.

  4. Thanks for this info and your time, Neil. I picked up some new things to consider here – didn’t realize the search-within-search reality.

    Nick makes a good point above; Panda prompted Web masters to remember the users. I had a “plain clothes” discussion with a Web master who had aspirations to rank well for a nationwide service.

    Naturally, they wanted to rank well in all major cities. To begin, attempting to rank for awkward phrases, uniting services and geo points by inserting them in your copy, makes for an alarming read (by humans). Additionally, much to the disappointment of the Web master, I dismissed their “quick fix” content aspirations, suggesting creating separate sites or respective sections of a central site, creating a “unique” experience aligned to each service location, rather than reiterating the same content, switching out a few “key” geographically-aligned terms.

    I used to teach English; I often see parallels between “good” students who put in the time studying, who did their “homework” and Web masters who align sites and endeavors with “best practices.” Unfortunately, there were other lazy students, who thought a quick glance at “cliff notes” would suffice and make the grade; I had to break out the red pen on the less motivated; Panda does much of the same for Web masters who don’t want to become “good students” of best practices.

  5. Very good, comprehensive article, Neil. Panda has changed things permanently and we SEOs will need to change with it. I’m eager to see what the next release will do.

  6. I gotta disagree with a few things. You don’t know what Google penalizes sites for.

    You can’t prove it!

  7. I think adding plenty of relevant images and videos in an article helps, instead of just text. See, this article uses a lot of images for illustrations too. Thanks for the good tips.

  8. Hello Neil, I really appreciated your article and especially the terminology for labeling each variation of duplicate content. This being said, I tend to disagree on one point: the possibility of being penalized because of syndicated content. I do think that the issue here is to be indexed first, and then other copies that appear would normally be given less consideration by Google. If it were so easy, wouldn’t it be enough to scrape a competitor’s content on a regular basis and get him penalized?
    There are successful authors (and I do think you are one of them :-) producing great content; this will naturally be replicated across the web (Scoop.it, for example), and it would be unfair if your site were penalized because of that.

  9. Thanks, very nicely written article. Google Panda has changed SEO completely, but it has changed it for the better. Now design is also important. Making a website only for the purpose of making money is not good; you have got to think about your users too. Concerning duplicate content, I find it right for Google to penalize duplicate pages. Duplicate content makes the search listings look bad and spammy.

  10. Thank you for the timely update. We all need help in keeping current with the ongoing changes at Google.

    One practice that was not mentioned is the technique of pointing multiple keyword-rich domain names to the same site. This was widely touted some time ago. It would seem that this practice would now trigger duplicate content alarms as well.

    True?

    1. That’s an interesting question. I think it can trigger questions about the sites being related or dependent, which search engines clearly look at. However, if the outbound links are not all (or mostly) pointing to the same site, I wouldn’t think there would be a problem just because of the domain name; this is supposed to increase relevancy for the target site on those keywords. You might also be describing a link wheel, and that could get you into trouble for sure, with or without duplicate content.
      Any other views on this issue?

  11. You were right on the money with the content density and usability mentions (given the Google announcement about the content above the fold update just a day or so later… or do I have my dates wrong?)! Nice work (as usual).

  12. Hey Neil, you can go through my blog and check. I have implemented the same practices you mention in the post, and they really work. Thanks again for your valuable article.

  13. I wonder if searchenginejournal.com or searchenginewatch.com are planning to make some changes to their current layouts? These sites are currently ads heavy.

  14. Very interesting article. I didn’t know about low copy density in web design, although in your example there are many ads. Does it mean we don’t have to be very creative and should focus on content instead? It sounds weird to me…

  15. I worked hard on the content of a site pertaining to women’s fashion accessories and reaped rich results in rankings. However, recently I made the mistake of using spun content on the site, and the ranking dropped from the first to the fourth page within 2 days. That was the first real lesson for me on SEO. Up until then I had only been writing original, authentic content. Thankfully the spun content was only in 2 posts, and I removed them immediately, 2 days after publishing. What I want to ask is this: even though those posts contained spun content, they both passed Copyscape. Only on manual reading can you tell that the content quality is not good. Does that mean Google also analyzes the quality of content, even if it passes Copyscape?