Are You Making These 7 Panda-Punishing Content Mistakes?

The Google Panda update has definitely changed the SEO landscape. In a great short video Rand explains how it changed SEO best practices forever. SEOs are now responsible for:

  • Design and user experience
  • Content quality
  • User and usage metrics

In a way, your job has been upgraded to that of web strategist. In the past, you only had to worry about generating content. Now you have to worry about generating high-quality content.

Fortunately, Google has shared plenty of clues on how to do that, like encouraging us to think like engineers who ask these 13 questions when generating SEO-friendly posts.

That’s great for the future, but what about the content that exists on your site now…the content that you created based on the old algorithm rules? If you don’t fix it, Panda could hurt you.

Here’s what you need to look for and how to fix it.

Internal True Duplicates

When it comes to duplicate content, this is the most basic. Internal true duplicate content is what happens when you create unique URLs for the same piece of content.

For example, let’s say I want to promote “How Social Media Affects SEO.” There is the original URL. Then I create a special promotion page off my root domain, quicksprout.com. Let’s call it quicksprout.com/promotion.

Well, off that parent I build a URL like this: quicksprout.com/promotion/how-social-media-affects-SEO.

As you can see, I now have two URLs pointing to the same content on my site. These are internal true duplicates…and when Google finds them, it may penalize you if it thinks you are trying to game the algorithm.

That’s important to realize: you may not be trying to game anything. You simply want to promote. Google, unfortunately, can’t know your intentions. Yet.

The more URLs you build that point to the same page, by the way, the more it looks like you are trying to game Google and the more likely you are to be penalized site-wide for the practice.

Takeaway: To take care of this problem, you simply have to eliminate the duplicate content. Fresh, unique, original content across your site is the only thing Google wants to see.
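
The takeaway above is simply to get rid of the duplicate, but if you need to retire an extra URL like the promotional one without losing visitors or links that already point to it, a common approach is a 301 redirect to the original. On an Apache server, for instance, the .htaccess entry would look something like this (the paths are just the hypothetical ones from my example):

  # .htaccess on quicksprout.com: permanently redirect the duplicate promotional URL to the original article
  Redirect 301 /promotion/how-social-media-affects-SEO /how-social-media-affects-SEO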

External True Duplicates

If you syndicate your content across domains, whether they are yours or somebody else’s, your site could be penalized: when Google sees the same content across multiple sites, it looks like SERP noise.

For example, say that for every article I write on Quick Sprout I distribute it to syndicators, and that I do this for “How Social Media Affects SEO.” If you then search for that article and the SERPs throw up multiple URLs for the same content…that’s noise Google doesn’t like and will likely penalize.

Takeaway: The fix, if you own all the domains, is to create a cross-domain canonical tag. Basically, you choose which URL is going to be the source and point every other copy to it.
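
For example, if the Quick Sprout version is the source, each syndicated copy on the other domains would carry a canonical tag in its <head> pointing back to it (the URL below is just illustrative):

  <!-- On each syndicated copy: tells Google the Quick Sprout URL is the original version to index -->
  <link rel="canonical" href="http://www.quicksprout.com/how-social-media-affects-seo/" />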

If you don’t own all the content properties, then you need to work with the other webmasters to fix the problem…which can be a time-consuming task. By the way, do I recommend syndication? No, for the reasons I mention above.

Special note: Sometimes scrapers will cause duplicate content problems for you across the web. Look at this SERP for one of my articles:

That’s a lot of scraping going on. Fortunately, the original article is on top. Why? Because I’ve got more site authority than the scraper sites.

So, the moral of the story: if you are getting beaten by scraper sites, you need to build your site authority. If that doesn’t help, issue a DMCA takedown.

Internal Near-Duplicates

Sites that are trying to rank high for locations or themes often get penalized here, because what they typically do for content is generate one article on a topic and then change it slightly for each page, swapping out a few strategic keywords and headers.

You might see this with national companies trying to rank high for individual city SERPs, like “best Seattle insurance,” or for themes like “child head injury trauma” and “child neck injury trauma,” where the introductory copy is changed to reflect the keywords but the bulk of the remaining content, say a section on a guardian’s rights, stays the same across all the pages.

Notice how this plays out in the search results for “child injury trampoline lawyers”:

Here’s the content when you click through:

Notice that the only mention of “trampoline” is that one bullet point? When Google crawls this content, it sees that only a few lines of copy have changed and immediately thinks low quality. That’s probably one reason the page sits at number four, and I wouldn’t be surprised if it sat even lower in the rankings.

This can also happen to large ecommerce sites when a product has multiple variations, like size and color. The copy stays the same; only a few words change based on these variations.

Takeaway: The only truly suitable way to fix geo-location or theme-based duplicate content is to rewrite the content so it is fresh and original. That’s time-consuming and expensive, but you have to ask yourself…isn’t avoiding a penalty worth it?

The same holds true for the product pages of an ecommerce site, but it’s not always feasible to write fresh, original content for each variation of a product…so create one main product page and then keep the dynamically generated variation pages out of the index with meta noindex tags.
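
For example, the template for each dynamically generated variation page might include a robots meta tag like this in its <head> (the “follow” directive is a common choice because it still lets crawlers follow the links on the page):

  <!-- On each size or color variation page: keep it out of the index but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">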

External Near-Duplicates

Affiliates and partners may sometimes grab content and product pages from your site. This creates near-duplicate content across the web, which Google treats much the same way it treats true duplicates.

Takeaway: To beat out the borrowed content, you have to beef up your pages with unique content, which is usually done quite effectively, and without much cost or time, through user-generated content. Just two or three pieces of UGC and a unique intro to your copy will do the trick. You can focus on your top 10% best-performing products and test the results.

Search Within Search

Large sites, especially ecommerce sites, tend to generate pages and pages of internal search results. When a user searches and one of these search pages shows up in the SERPs, Google frowns on it.

The reason is that Google doesn’t want to deliver another search page to a user; it wants to deliver a relevant, useful page of information. So if it spots these pages, it will mark them as spam and penalize you.

Takeaway: Obviously this doesn’t affect small sites or most medium-sized ones, so it may not be of interest to you unless you are responsible for a large site. To control this problem, you must block these pages from being crawled or noindex them. It’s time-consuming, but it must be done.
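
For example, assuming your internal search results live under a /search/ path (that path is just illustrative…use whatever pattern your site actually generates), a robots.txt rule at the site root keeps crawlers out of them, and the noindex meta tag shown earlier works for pages you would rather keep crawlable but out of the index:

  # robots.txt: keep all crawlers out of internal search result pages
  User-agent: *
  Disallow: /search/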

High Structure Site Design, Low Copy Density

It seems like Google will also penalize those sites that put a heavy emphasis on a design template and neglect the copy portion.

For example, here is a page from WebMD.

The problem with this is that it creates a page dominated by repeating elements like mega-footers, excessive navigation, dynamic content, or repeated images. The content quality could be high, but these signals indicate low quality.

Takeaway: Write original, detailed content that signals high quality, and tone down the design template. I’ve learned a lot from my own experience with site design, and from testing it with you, and I’ve discovered that simplicity is king. Determine what your site design can live without and eliminate it. Test all of this, of course.

High Ad Ratio

This one and the last one are similar in that they are design related, but this one deserves its own section because it is a huge issue.

Webmasters want to make money off their sites with the passive income that comes from ads…so they unfortunately think that if one ad makes $100 a month…then 10 should make $1,000. The problem is that a high ad ratio is one of the signals the Panda quality-control panel flagged as noise…and gave a thumbs down. This is true even if you are producing high-quality content. You may not think it is fair, but users prefer fewer ads…which means so does Google.

Takeaway: Limit the ads on your site to the two or three highest-performing ones. In fact, if you are tracking your ads, you should already know that not all of them perform the same. Pull the ones that aren’t working.

Conclusion

When it comes down to it, you should also strongly consider removing your lowest-performing pages. These are the pages with few if any incoming links, a low visit rate, or a high bounce rate. You can rehab these pages if you want, but I found it easier to just scrap them and start again. Here’s how to identify low-performing pages.

And regarding duplicate content, you can use a couple of tools to help you identify it: Copyscape or Duplicate Content.

How has your site and content been impacted by Google Panda? Do you have duplicate content issues? And how did you handle them?

 
