
Pagination and Duplicate Content Issues

Large dynamic sites very often can't do without pagination. But beyond usability, webmasters then run into another serious issue: duplicate content.

The problem is that all paginated pages normally have identical titles and meta descriptions. While some SEOs consider this a non-issue (since search engines "don't see identical titles as duplicate content"), I do believe it can at least reduce a site's crawl rate and depth.

The problem has a number of solutions, none of which, unfortunately, is perfect:

Solution 1: Add a different, unique portion of the title/description to the beginning.

e.g. <title>A-G Blue Widgets</title>

Drawback: A meaningful unique element can't always be derived from the pagination; most often it will be just a page-number variable.

Solution 2: Put all the content in one HTML document, then use JavaScript to create pagination without reloading the page.

Drawback: This can be done only if there are not too many results to list; otherwise the page will be enormous and search engines won't be able to crawl all of it.

Solution 3: Add a noindex meta tag to each page except the first one, keeping search engines from indexing the pages while still allowing them to crawl and follow the links:

<meta name="robots" content="noindex, follow">

Drawback: Naturally, none of the pages except the first one will ever rank, and this still doesn't solve the "PageRank leak" problem (too many links to follow and pass weight to).
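To make Solution 1 concrete, here is a minimal sketch of generating a unique title per paginated page. The function name, the title pattern, and the item-range idea are illustrative assumptions, not something prescribed by the article:

```javascript
// Sketch (hypothetical helper): build a unique <title> for each
// paginated page by embedding the item range, rather than relying
// on a bare "Page N" suffix.
function buildPageTitle(baseTitle, page, perPage, totalItems) {
  // Page 1 keeps the canonical title so the main listing stays clean.
  if (page === 1) return baseTitle;
  const first = (page - 1) * perPage + 1;
  const last = Math.min(page * perPage, totalItems);
  return `${baseTitle} (items ${first}-${last} of ${totalItems})`;
}
```

For a sorted listing (as in the A-G example above), the same idea could embed the first and last item names on the page instead of the numeric range.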

Now, while each of these solutions has its pros and cons, I have a question for you: how do you handle these issues?

  1. Do you try to avoid pagination at all?
  2. Do you believe pagination doesn’t create duplicate content?
  3. Can you think of any other solution not listed here?
Ann Smarty is the blogger and community manager at Internet Marketing Ninjas. Ann's expertise in blogging and tools serves as a base for her writing, tutorials and her guest blogging project, MyBlogGuest.com.


8 thoughts on “Pagination and Duplicate Content Issues”

  1. We are experiencing a lot of duplicate content right now. Some of your site's link value will be spent on pages that don't get indexed. If you eliminate the duplicate content, that link power will be distributed only among pages that do get indexed, resulting in a potential improvement in the ranking of those pages.

  2. This type of duplicate content does not incur penalties in search results. It usually only affects queries where the titles and/or meta descriptions are similar enough that the search engine may omit some of the pages (typically in SITE searches).

    For people who expect their users to use a major engine for site search, this is a major duplicate content problem. You absolutely need unique titles and meta descriptions for effective third-party site search.

  3. Sites that use the pagination technique for page views may be fooling themselves. I just worked on a site that has thousands of page-2-and-beyond pages, but when I looked at the analytics I found that these extra pages accounted for only 2% of their page views and <0.5% of their entry pages. It didn't seem like it would be a big loss to get rid of them, and the SEO and user benefits of putting everything on one page might be a lot greater.

  4. Ann,

    I have successfully used method #1 for some time. Initially, I was worried that pages would take too long to load if there were too many products. However, I found through A/B testing that visitors were willing to wait. In fact, I've successfully used this method on product category pages with over 100 products.

    More and more I'm finding that this method is preferable not just from an SEO view, but for website usability. Paging irritates people, as they have to navigate those tiny 1,2,3… links, and they sometimes miss them altogether.

  5. Hi All,

    I use method #1 too, and think pagination is a good thing.

    One doesn't have to use the tiny 1,2,3 system (which is a pain indeed); use a text anchor instead, which suits both usability and SEO (e.g. “Next blue widgets”).

    I find myself discouraged when confronted with tens or hundreds of results. Moreover, I think there are lots of ways (user votes, for instance) to sort your items so that the most relevant are grouped on the first page.

    Regards,

  6. Hey Ann,

    Those methods may not be perfect, but they are efficient.
    Duplicate content and meta tags are a huge problem for website owners.
    One has to figure out an efficient way of solving this.
    And about the post: it's a really informative and impressive article.