5 Duplicate Content Issues for Magento

As the vast majority of our SEO clients' sites are built on Magento, I've faced a huge number of technical issues with the platform, most of which come down to duplicate content.

Here are five of the issues I’ve faced, along with ways to resolve them and prevent them from happening again in the future:

1) Duplicate content from dynamic filter pages

Dynamic filter pages are one of the most common technical issues people face with Magento, and site owners often find that thousands of them are indexed by search engines.

Example of a dynamic filter page:


These pages are used to filter product results on category pages.

Resolving the issue:

Having tried and tested a number of alternative fixes, we’ve found that using meta robots rules is by far the most effective way to eliminate this issue. This is the process we follow to get the pages removed from the Google index:

  • Map out the query strings that you’re looking to remove.
  • Apply meta robots rules (we use noindex, follow).
  • Start doing removal requests in Google Webmaster Tools.

Once you've submitted a few removal requests, you should find that Google takes note and removes the remaining pages more quickly.
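As a sketch of what the rule produces, any filtered URL you want removed would carry a tag like this in its <head> (which query strings trigger it is site-specific and depends on the mapping you did in step one):

```html
<!-- Output on filtered pages only, never on the clean category URL.
     "noindex" drops the page from results; "follow" still lets
     crawlers pass link equity through to the products it lists. -->
<meta name="robots" content="noindex, follow" />
```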

You can use our SEOpack Magento module to apply the meta robots rules automatically. This module also has several other useful features. Find out more here.

2) Duplicate content from search pages

Search pages finding their way into Google’s index is a very common issue for Magento users—the vast majority of Magento websites that I’ve looked at have this issue.

Resolving this issue:

Resolving the issue is straightforward: you just need to disallow the directory in your robots.txt file. The URL for search pages is usually ‘/catalogsearch/result/?q=test’, so if you disallow the catalog search directory and then remove it in Google Webmaster Tools, the issue should be fixed within around eight hours.
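Assuming the default Magento search path, the robots.txt rule would look something like this:

```
# Block crawlers from all internal search result pages
User-agent: *
Disallow: /catalogsearch/
```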

3) Duplicate content from review pages

Review pages are interesting, because the issues can vary depending on how you structure your website and the plugin you're using. We faced a duplicate content issue because we displayed review content on product pages but also had a separate page for the same content; I know that this is not always the case.

So, for the scenario that we faced, we removed the extra pages (one for each product), which looked like this:


In order to remove these pages, we simply disallowed the /review/ directory in the robots.txt file and then submitted a removal request for the folder.
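The rule we added was along these lines (the path may differ depending on your store's URL structure):

```
# Block crawlers from the standalone review pages
User-agent: *
Disallow: /review/
```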

Once again, review pages would only cause duplicate content issues if you’ve structured your website in the same way as we did, where you can go to a specific page to read product reviews, as well as on the product pages themselves.

4) Duplicate content from Pagination

The discussion of whether paginated versions of pages should be accessible to search engines has been going on for a number of years, but last year Google announced rel="next" and rel="prev" (along with support for ‘view all’ pages), allowing site owners to show search engines that these pages are, in fact, pagination.

So, my answer to eliminating the duplicate content issues caused by pagination is to implement these tags, which can also be simplified by using our SEOpack module.
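As a sketch, page two of a three-page category would carry tags like these in its <head> (the URLs here are purely illustrative):

```html
<!-- On /shoes?p=2: point to the adjacent pages in the series so
     search engines treat the set as one paginated sequence -->
<link rel="prev" href="https://www.example.com/shoes?p=1" />
<link rel="next" href="https://www.example.com/shoes?p=3" />
```

The first page of a series carries only rel="next", and the last page only rel="prev".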

5) Duplicate content from non-search friendly URLs

In previous versions, Magento used to employ non-search-friendly URLs by default, and they often seem to pop up in newer versions, too. URLs like the example below are commonly indexed by search engines and cause duplicate content issues.

These pages are unlikely to rank for anything or appear whilst navigating through the website; they’re caused by issues with rewrite rules.


I would recommend disallowing these pages in your robots.txt file and then doing a removal request for the folder in Google Webmaster Tools.
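Magento's non-rewritten URLs typically live under /catalog/product/view/ and /catalog/category/view/, so, assuming that default structure, rules like these would cover them:

```
# Block the raw module/controller URLs that bypass URL rewrites
User-agent: *
Disallow: /catalog/product/view/
Disallow: /catalog/category/view/
```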

If you want to find out more about doing SEO on Magento websites, you can read my Magento SEO guide, which I wrote on the GPMD blog.

Paul Rogers
Paul is an experienced technical SEO, who specialises in working on the Magento platform. Paul currently works in-house for a large UK-based gift retailer, whilst... Read Full Bio
  • Angela

That's a big problem not only for Magento, but for all ecommerce shops that can't describe their products well; if you take a look and research, one product often has a lot of duplicate descriptions due to lack of content and copy/pasted text.

  • Tom

    I have encountered developers who have worked on some pretty big Magento projects that overlook these basic duplicate content issues. I have struggled for many hours cutting these pages from sites. My favourite was a Magento store with 20 products, a few blog posts, Google indexed over 10,000 pages. G+1’d your gpmd Magento guide, good read.

  • Shreyas – Convonix

    Hey Paul,

Very helpful post. Keep up the good work. You have pointed out most of the duplicate content issues occurring on an e-commerce platform.
I am a bit skeptical of the scalability of including all URLs in the robots file, because a lot of them are dynamically generated, and if you have even 50 products on your website, it will take you quite some time to map them all out. Also, the only way for a webmaster to fully understand how and which search-unfriendly URLs are being created on his website and cached by Google is by looking into Google Webmaster Tools. My point is that WMT takes some time to update itself (I have seen a lag of at least two weeks), and the damage will be done by then. An alternative would be to dynamically generate URLs based on the path used to reach a page and then implement redirects; importantly, this functionality should be visible only to browser user-agents, while crawlers are explicitly shown a single page.
    Honestly, I am not sure whether this functionality has already been integrated into the Magento CMS. But, if not, I am positive that it would help ecomm webmasters a lot.
    Would love to hear your thoughts.

  • Dan Kern

    Great article, Paul. I’ve been working with our company’s Development team this year to optimize our 15+ Magento sites, both technically and editorially, and we actually chose to go the “canonical URL” route on things like Review pages (set the canonical to the core product page URL) and Dynamic Filtering (set the canonical to the core category page URL). What I’ve noticed is that Google still has some of these Review and Dynamic Filter pages indexed. We’re going to explore additionally setting them to “noindex,follow” and blocking the directories via Webmaster Tools. A couple of our sites for example (in case you want to take a look): http://www.northlightshop.com & http://www.writersdigestshop.com.

    Thanks for the extra insight here 🙂

  • jay

    Thank you for these tips on fixing duplicate content issues with Magento!

  • Jaimie Sirovich

This list looks correct, but the solutions are unfortunately wrong for most use cases on faceted search. Take a look at http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck for some other ideas. Some of our projects are cited, and most of the workable ideas for faceted search are enumerated. There’s no one canonical solution yet 🙁

    The other stuff looks right.

  • cray

Hi, thanks for the good tips. I have a question: how do I remove the directory from “Duplicate content from search pages” from Google search results? Do I just need to put /catalogsearch/result/ or http://mysite.com/catalogsearch/result/ in the removal request box?

    • cray

      I mean in the google webmaster tool

  • Derek

We have thousands of reviews, and the company I am working for just added the top 10 reviews to the product page and a “see all” link to the review page. Doing this means the 10 reviews on the product page are not unique. I explained that we should show the top 10 reviews on the product page and then have a link that says “see the rest,” so you can see the rest but the same 10 reviews from the product page will not be there. What do you think about this? Also, did you see a difference in rankings once you got rid of the duplicate content issues caused by the reviews?