
Facebook Punishes Scraper Sites


Facebook has announced a second wave of actions against spam, this time targeting promotional posts that link to low-quality ad farms that copy content from other sites. The announcement was appended to a pre-existing announcement about fighting Facebook spam.

What Facebook Announced:

“Starting today, we’re rolling out an update so people see fewer posts that link out to low-quality sites that predominantly copy and republish content from other sites without providing unique value. We are adjusting our Publisher Guidelines accordingly.”

What Facebook Told TechCrunch

TechCrunch reported on Facebook’s spam crackdown and described what Facebook told it:

“Today it exclusively told TechCrunch that it will show links less prominently in the News Feed if they have a combination of this new signal about content authenticity along with either clickbait headlines or landing pages overflowing with low-quality ads.”

How the Facebook Update Affects You

This update has far-reaching implications. The posts Facebook is down-ranking link to what are known as scraper sites, or simply scrapers. Scrapers are so called because they use bots to “scrape” (copy) content from the web.
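As a minimal illustration of what “scraping” means (a hypothetical sketch, not the code of any particular scraper), a bot fetches a page’s HTML and copies out its text content. Using only the Python standard library:

```python
from html.parser import HTMLParser

class ContentScraper(HTMLParser):
    """Collects the text of a page, the way a scraper bot
    'scrapes' (copies) an article before republishing it."""
    def __init__(self):
        super().__init__()
        self._in_body_text = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "p"):
            self._in_body_text = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "p"):
            self._in_body_text = False

    def handle_data(self, data):
        if self._in_body_text and data.strip():
            self.chunks.append(data.strip())

# In a real bot this HTML would come from an HTTP request to the target site.
html = "<html><body><h1>Original Article</h1><p>Unique content here.</p></body></html>"
scraper = ContentScraper()
scraper.feed(html)
print(scraper.chunks)  # copied text, ready to be republished
```

A real scraper runs requests like this across thousands of URLs, which is what generates the traffic problem described below.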

These bots can create heavy traffic on a website, which can prevent Google’s own bot from adequately crawling it. This is one reason Google Search Console shows 404 Page Not Found response codes: Google may have tried crawling your site while the server, overloaded by scraper bots, was unable to serve the pages.
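One way a site owner can spot this is to count requests per client in the server’s access log. A rough sketch (the log lines below are hypothetical, and field positions vary by server format):

```python
from collections import Counter

# Hypothetical access-log lines; real lines come from your web server's log.
log_lines = [
    '203.0.113.5 - - [10/May/2018:10:00:01] "GET /article-1 HTTP/1.1" 200',
    '203.0.113.5 - - [10/May/2018:10:00:02] "GET /article-2 HTTP/1.1" 200',
    '203.0.113.5 - - [10/May/2018:10:00:02] "GET /article-3 HTTP/1.1" 200',
    '198.51.100.7 - - [10/May/2018:10:00:03] "GET /about HTTP/1.1" 200',
]

# Count requests per client IP (the first field in this log format).
hits = Counter(line.split()[0] for line in log_lines)

# Flag clients making a disproportionate share of requests --
# a crude signal of scraper-bot traffic.
suspects = [ip for ip, count in hits.items() if count >= 3]
print(suspects)
```

The threshold here is arbitrary; the point is only that one client hammering the server crowds out legitimate crawlers like Googlebot.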

Facebook’s action will cut off traffic to these rogue sites, which will in turn impact their ability to earn money from ad impressions. If Facebook is effective, this action may eventually slow down the volume of scraper traffic.

Slowing down scraper traffic is good because it allows a web server to operate under a normal load instead of a heavier load that may prevent Google from properly indexing the site.

Google is good at preventing scraper sites from ranking for actual search phrases. Scrapers tend to rank only for extreme long-tail and nonsensical phrases, such as verbatim snippets of text copied from the original website. However, searching for such snippets gives a false impression of how well Google ranks original content versus copied content: judging Google by snippet queries is not how Google search is meant to work, so those results are not representative of how well Google ranks original content.

Search engines are tuned to give results for a search query. User intent, and how the answer to a query is expressed on web pages, is core to how Google ranks websites. A web page snippet is not a search query, so snippet searches do not accurately reveal how well Google excludes scraper sites.

Facebook’s algorithm has given scraper sites an opportunity to revive the business model of driving traffic to ad farms. But these updates have pushed them back down.

Images by Shutterstock, Modified by Author



Roger Montti
