SEO Dilemma: How to Get Rid of Hundreds of Short-Term Pages


We once discussed a related question: how to get rid of tons of expired pages. The only conclusion we arrived at was that there is no easy answer.

A recent WebmasterWorld thread is discussing a similar but even more complex question: "How to get rid of user-generated pages that expire daily."

It is quite clear why this problem desperately needs a solution:

  • Google crawls on a budget: hundreds of low-quality pages eat into that budget, keeping Googlebot from crawling other (more important) pages;
  • Too many pages with almost no content and outdated data result in wasted link juice;
  • Users landing on those pages see outdated information, which makes for a bad user experience.

Still, those pages are not useless at all, as users link to them (when, for example, announcing an upcoming event on their own sites).

So what’s the solution?

Using a 301 Redirect

This is the most obvious solution as it saves the link juice. But redirect where?

  • Setting up a separate page that says something like "The page you attempted to enter has expired, so keep looking" seems bad for user experience.
  • Redirecting to some dynamic page (search results) that might be similar to what the expired page was about may end with those dynamic pages getting indexed, which may look strange and even spammy to Google.

Note that Google has made it clear that they don’t want to see search results indexed in their search results – so I would not suggest going in that direction for your expired content. Offering a lot of URLs that are only dynamically generated search results based on the referer could result in penalties after a while.
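If you do go the 301 route, the decision logic is small enough to sketch. This is a minimal illustration, not a recommendation of any particular stack: the `EXPIRED` mapping and the URLs in it are hypothetical stand-ins for however your CMS tracks expired pages and their hand-picked evergreen destinations.

```python
# Sketch of 301 logic for expired pages: redirect each dead URL to a
# hand-picked evergreen page, never to dynamically generated search results.

# Hypothetical mapping maintained by your CMS: expired URL -> evergreen target.
EXPIRED = {
    "/events/2008-seo-meetup": "/events/",
}

def choose_response(path):
    """Return (status, location): 301 to an evergreen page if expired, else 200."""
    target = EXPIRED.get(path)
    if target:
        return 301, target  # permanent redirect passes most link juice
    return 200, None        # page is still live; serve it normally
```

The point of the explicit mapping is that every redirect target is a real, stable page a visitor would plausibly want, which avoids the indexed-search-results problem described above.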

Using a 404 Page

Here’s another solution offered in the thread: let those pages return a 404 response code and let Google figure out the situation by itself:

…it’s very likely that the backlinks you are concerned about will begin to lose power quite quickly, even if the webmaster leaves them online after the expiration date. So "squeezing too hard" on the potential link juice is probably not a worthwhile prospect.
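If you take the 404 route, the key is pairing the correct status code with a friendly error body so users are not left at a dead end. A minimal sketch, assuming a hypothetical `expires_on` date stored alongside each user-generated page:

```python
from datetime import date

# Friendly body served with the 404 so visitors aren't left at a dead end.
FRIENDLY_404 = "<h1>This listing has expired</h1><p>Browse our current events instead.</p>"

def respond(expires_on, today=None):
    """Return (status, body): a real 404 once the listing's date has passed."""
    today = today or date.today()
    if today > expires_on:
        return 404, FRIENDLY_404  # genuine 404 lets Google drop the page
    return 200, "<p>live listing</p>"
```

Returning a genuine 404 status (rather than a "soft 404" page served with a 200) is what allows Google to drop the expired pages, as the quoted comment suggests.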

So what would be your take?

Ann Smarty
Ann Smarty is the blogger and community manager at Internet Marketing Ninjas. Ann's expertise in blogging and tools serves as a base for her writing, tutorials and her guest blogging project.
  • That’s a straightforward situation; I really don’t see anything special in it. First, stop the new pages from ever making it into the Google index (robots.txt, anyone?). Second, 404 the old pages with a friendly custom 404 page to minimize user frustration and to let Google drop the pages as quickly as possible. Problem solved.
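    The robots.txt approach this commenter mentions could look like the fragment below, assuming (purely for illustration) that the short-lived user-generated pages all live under a `/daily/` directory:

    ```
    User-agent: *
    Disallow: /daily/
    ```

    Keep in mind that robots.txt only blocks crawling; URLs that are already indexed or heavily linked from elsewhere may still need a noindex meta tag or a 404 to actually drop out of the index.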

  • Someguy

    I would probably go for a 301 rather than a custom 404. I don’t know that site’s architecture, so I can’t judge what would really make sense, but the target page of your 301 could be given noindex,follow, for example…

  • That’s a big deal, Ann. I think that whenever possible the old web page should be kept fresh with new data.
    Here’s an example that seems to work pretty well in the tourism field. I set up specific time-frame-related web pages that periodically expire because the event is gone.
    What I do is leave the web page on the server, simply saying that the offer is gone; then, for the next valid period, I edit the web page again, change the offer and the validity, and push the new content in.

    The result is that those web pages are very well indexed for the keywords I need.

    Note that I don’t use the expires meta tag; otherwise I would probably get the web page “penalized” at the end of the offer.

  • The tourism example is quite different from what is at hand – user-generated content that, once expired, has nothing to replace it…

  • As far as short-term pages are concerned, I like to put more relevant content on them. If your website has a page talking about 2008, you can put more about 2009 and other offers there. This works because the older page gets more visits than a newer one.

    Another idea is to put this meta tag on short-term pages:

  • Putting unavailable_after meta tags, like <META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">

  • I agree with Dipali…

  • Why would you want to have pages that exist for just one day to ever get into the search engines? That’s where I’m lost. And in that scenario, I agree with the noindex,nofollow / robots.txt disallow method.

  • Hmm, that is such an informative post, Ann, and thanks Dipali for the code: <META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">

  • @Geo it mostly depends on what the UGC talks about. If it is pure personal experience, it probably will be.