
Matt Cutts Announces Tool For Reporting Scraped Content Ranking Better Than Original Content

Matt Cutts, Google’s head of search spam, made an announcement on Twitter today about a new tool developed to let Google know when you find scraped content ranking better than original content.

Simply called Scraper Report, the new tool allows you to alert Google about these instances by filling out three fields. All you have to do is provide the URL of the page where the content was taken from, the URL where the scraped content appears, and the search result URL that demonstrates the problem. You also have to check a box confirming your site is following Google’s Webmaster Guidelines and is not affected by manual actions.

It’s important to note that the Scraper Report tool does not promise any kind of fix, or give any indication of what Google intends to do with the information you submit to them. Since this news was reported by Matt Cutts, it can be assumed that the information reported will be sent to the webspam team for consideration as a spam offense.

Whether or not the offending content will be removed is uncertain, but at the very least this information will hopefully be used to help the spam team improve Google’s algorithm so original content ranks above scraped content.


Matt Southern

Freelance Writer at MattSouthern.com
Matt Southern is the lead news writer at Search Engine Journal. His passion for helping people in all aspects of online marketing flows through in the expert articles he contributes to many well-respected publications across the web. Contact him via his website if you'd like him to write for you.


15 thoughts on “Matt Cutts Announces Tool For Reporting Scraped Content Ranking Better Than Original Content”

  1. I like that – I’ve got an example of a site that I can test this with right away. They have been scraping my RSS feed for years now and using a link like ‘read more’, ‘find out more’, etc. at the end of the content. I think they are using that old AutoBlogger WordPress plugin; even in its day it was pretty crude. I’ll be back with my findings!

    1. Yes Rick, most bloggers use our content and just add a “read more” link at the end, but really they should include more related information or content and place some anchor links within the content.

  2. I think that’s a good move. Coming up with good quality and original content is often hard work. I can fully understand why this can frustrate site owners and bloggers.

  3. It’s a smart tool. My only concern with it is someone using it as a weapon against the original owner. What if someone scraped the content and then reported the original site a couple of days later?

    I know there are ways to discover who the original content producer is, but will Google leave room for dispute? Or will they just go ahead and penalize whoever was reported?

  4. I really appreciate this step taken by Matt Cutts, but I want to ask a question. Someone can do black-hat SEO against any website and no one can stop them. If somebody complains about scraped content, how will Google determine whether the claimant is right or wrong?

  5. I think this is a great step taken by Google in order to maintain the originality of the content. This will also improve the ranking of the websites containing the original content.

  6. In your opinion, does this apply to direct duplicate content only (a blatant copy and paste), or also to spun articles (an article summarising another article using different words)?