
20 Ways to Get Deindexed by Google

Why does Google remove some content from its index? These 20 practices can explain why Google chooses not to display some webpages.

Why does Google remove content from its index?

Google chooses to exclude some webpages because not every optimization is a good one, and some content just doesn’t provide a good answer for searchers.

You may be accidentally publishing spam pages in pursuit of SEO or trying to deceive Google’s algorithm.

In this column, you’ll learn more about 20 different ways you might find your site deindexed by Google, including:

  1. Crawl Blocking Through Robots.txt File.
  2. Spammy Pages.
  3. Keyword Stuffing.
  4. Duplicate Content.
  5. Auto-Generated Content.
  6. Cloaking.
  7. Sneaky Redirects.
  8. Phishing and Malware Setup.
  9. User-Generated Spam.
  10. Link Schemes.
  11. Low-Quality Content.
  12. Hidden Text or Links.
  13. Doorway Pages.
  14. Scraped Content.
  15. Low-Value Affiliate Programs.
  16. Poor Guest Posts.
  17. Spammy Structured Data Markup.
  18. Automated Queries.
  19. Excluding Webpages in Your Sitemap.
  20. Hacked Content.

20 Practices To Avoid To Prevent Being Deindexed by Google Search

Certain SEO techniques can get your website removed from Google Search. Here are 20 schemes to avoid so you can keep ranking on the SERPs:

1.  Crawl Blocking Through Robots.txt File

You end up removing your URL from Google’s search engine results pages (SERPs) yourself if you have a crawl block in your robots.txt file.

“Page cannot be crawled or displayed due to robots.txt” is a standard error message that appears when your webpages are not crawlable.

If you didn’t intend to block the page, update your robots.txt file so Google’s crawlers can access and index it.

To do this, open the robots.txt file in the root directory of your website and ensure it looks like this:

User-agent: Googlebot
Disallow:

Instead of:

User-agent: Googlebot
Disallow: /
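You can check how a given set of robots.txt rules treats Googlebot before deploying anything, using Python’s built-in urllib.robotparser (the site URL below is a made-up example):

```python
from urllib.robotparser import RobotFileParser

# The blocking rules from above: Googlebot is disallowed everywhere.
blocking_rules = """User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(blocking_rules.splitlines())

# Googlebot cannot fetch any page on the site...
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))  # False

# ...but a crawler not named in any rule group falls back to "allow everything".
print(parser.can_fetch("Bingbot", "https://example.com/some-page"))  # True
```

Running this check against your own rules is a quick way to confirm you haven’t accidentally blocked Googlebot from pages you want indexed.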

2. Spammy Pages

Did you know that Google finds over 25 billion spammy pages every day?

There are several spam mechanisms Google finds on various websites. According to Google’s 2019 Webspam report, link spam, user-generated spam, and spam on hacked websites are the top three spam trends.

If you create suspicious pages to trick users and search engines or leave your comment section unprotected against user-generated spam, you risk removing your URL from Google search results.

3. Keyword Stuffing

Keyword stuffing refers to the irrelevant and excessive placement of a specific keyword throughout a content piece.

While keyword stuffing might appear an easy way to increase your rankings, you also risk having Google remove your website from search results.

Mention your keywords naturally in places like your page URL, post title, metadata, introduction, subheadings, and conclusion, and use them sparingly within the body.

Overall, each keyword placement should have a relevant context.

4. Duplicate Content

Google does not tolerate duplicate content, whether you copy other websites’ content or reuse content across your own webpages.

Google removes plagiarized content from the SERPs.

To avoid that, create unique and relevant content in line with search engine rules.

If you must keep duplicate pages on your website, use the X-Robots-Tag HTTP header or a robots meta tag with noindex and nofollow directives so the duplicates stay out of Google’s index.
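For an HTML page, the directive is a single meta tag in the page’s head; this is a minimal sketch of what it looks like:

```html
<!-- In the duplicate page's <head>: tell crawlers not to index
     this page or follow its links. -->
<meta name="robots" content="noindex, nofollow">
```

For non-HTML resources such as PDFs, the same directive can be sent as an HTTP response header instead: `X-Robots-Tag: noindex, nofollow`.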

5. Auto-Generated Content

A lot of website owners are the Chief Everything Officers of their businesses and therefore have little or no time for content creation.

Article spinners might be tempting as a quick solution. However, using article spinners might get your content removed from search results.

Google removes content that is auto-generated because it:

  • Focuses on replacing keywords with synonyms.
  • Adds little to no value to readers.
  • Contains errors and lacks context.

6. Cloaking

Cloaking is a violation of Google’s rules. It will get your website removed from Google search.

In cloaking, content delivery depends on “who” the user agent is. For example, a webpage may display text to a search engine bot and images to a human user.

In other words, a website’s visitors might see images, or even malicious content, while search engines like Google and Bing will see search optimized content.

7. Sneaky Redirects

Google penalizes sneaky redirects because they display different content to human users than what is served to search engines – similar to cloaking.

You risk removing your URL from Google if your redirect is a manipulative move.

Nonetheless, you can legitimately use redirects to send a user to:

  • An updated website address.
  • A single URL after merging pages.
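A legitimate, transparent redirect is typically a permanent (301) redirect that sends everyone – users and crawlers alike – to the same destination. As a sketch, in an Apache .htaccess file it might look like this (all paths and domains here are hypothetical examples):

```apache
# Site moved to a new domain: send every visitor to the same new address.
Redirect 301 / https://www.new-domain.example/

# Two old pages merged into one: both point to the single merged URL.
Redirect 301 /old-pricing /pricing
Redirect 301 /old-plans   /pricing
```

The key difference from a sneaky redirect is that the destination is identical no matter who requests the page.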

8. Phishing and Malware Setup

Google forbids cybercrimes, whether phishing or setting up malware like trojans and computer viruses.

Google’s content removal activates if you create malicious webpages to:

  • Gain unsolicited access to users’ sensitive information.
  • Hijack user system functions.
  • Corrupt or delete essential data.
  • Track users’ computer activity.

9. User-Generated Spam

While user-generated spam might appear on high-ranking websites, excessive user-generated spam can lead to Google removing your URL from its search results.

This practice is common on platforms that give users access to tools and plugins for creating accounts or adding comments.

Common examples of this spam include comment spam on blogs and forum spam – with malicious bots spamming the forum with links to viruses and malware.

10. Link Schemes

Link schemes involve soliciting link exchanges to increase the number of backlinks and, ultimately, search rankings.

These manipulative link-building practices, such as link farms, private blog networks, and link directories, violate Google’s SEO guidelines.

Google disapproves of:

  • Paid links for manipulation of search results.
  • Low-quality link directories.
  • Invisible links in the footers.
  • Comments and signatures on forums with keyword-stuffed links.

11. Low-Quality Content

Creating low-quality content can see your content removed from Google Search faster than you think.

You shouldn’t post irrelevant, meaningless, or plagiarized content for keyword ranking or consistency’s sake. Take time to write high-quality and original posts that your audience will find helpful.

12. Hidden Text or Links

Steer clear of using hidden text or links to boost your rankings. It violates Google’s rules and might lead to the removal of your URL from Google.

Google removes content containing text or links that:

  • Seem impossible to read.
  • Hide behind an image.
  • Match the website background color.
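In practice, these patterns usually come down to a few CSS tricks. A sketch of the kinds of rules Google treats as hidden text (the class names are hypothetical; these are anti-patterns to avoid, not to copy):

```css
/* Anti-patterns: CSS tricks that hide text from users but not crawlers. */
.keyword-block { display: none; }           /* invisible element stuffed with keywords */
.stealth-text  { color: #ffffff;
                 background: #ffffff; }     /* white text on a white background */
.offscreen     { text-indent: -9999px; }    /* text pushed off the visible page */
.tiny-text     { font-size: 0; }            /* zero-size text */
```

Legitimate accessibility techniques (such as visually hidden labels for screen readers) are fine; the violation is hiding keyword-stuffed content specifically to manipulate rankings.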

13. Doorway Pages

Doorways, also known as portal or bridge pages, are related websites or pages that rank for specific search terms but lead to the same destination once you click.

Google penalizes doorway pages because their sole aim is to funnel large amounts of traffic to a single destination while deceiving users with seemingly distinct search results.

14. Scraped Content

Some website owners scrape content from high-authority websites and republish it on their own sites with little to no modification. Even when they do modify the content, they often just replace words with synonyms.

While scraped content might masquerade as curated content, it violates Google’s Webmaster Guidelines and can result in the removal of your website from Google search since it:

  • Carries no originality.
  • Results in copyright infringement.

15. Low-Value Affiliate Programs

On your WordPress website, you may be running affiliate programs while simply posting the descriptions of the promoted products you find on other platforms. Google considers this behavior a poor content marketing effort, and can remove your URL from Google search as a result.

In general, Google removes the content of thin affiliate pages from appearing on the SERPs due to low-quality content.

16. Poor Guest Posts

Guest posting is a good SEO habit when done right.

On the other hand, if you don’t set strict guidelines and are publishing low-quality guest posts that link to spammy blogs, Google can deindex and remove your website from search.

17. Spammy Structured Data Markup

Google’s structured data guidelines state that you must steer clear of misleading or spammy markup to avoid getting a penalty.

Google uses structured data markup to determine whether a URL will show up in search results and rich snippets. If it finds irrelevant, manipulative, hidden, or harmful markup on your website, Google may remove that content from its index.
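The safe approach is markup that describes only content actually visible on the page. As a minimal sketch, honest JSON-LD Article markup looks like this (all values are hypothetical placeholders):

```html
<!-- Minimal JSON-LD Article markup; every field must match visible page content. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "20 Ways to Get Deindexed by Google",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-06-01"
}
</script>
```

Spammy markup, by contrast, claims ratings, reviews, or content types that don’t exist on the page, which is what triggers a structured data penalty.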

18. Automated Queries

Sending automated queries from your website to Google can earn you a penalty.

Avoid sending queries from bots or automated services to Google to check how your website ranks. Doing so violates the Webmaster Guidelines, and Google might deindex and remove your URL from Google search.

19. Excluding Webpages in Your Sitemap

Like metal to a magnet, search engine bots are attracted to sitemaps.

Your sitemap helps Google understand your website at a glance by:

  • Providing an overview of pages and their importance.
  • Displaying details of images, videos, and news.
  • Showing how your content is interlinked.
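A minimal sitemap that covers the points above is a short XML file listing each URL you want crawled; this sketch uses placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl. -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-06-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
    <lastmod>2021-05-20</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is usually placed at the site root (e.g. /sitemap.xml) and submitted through Google Search Console.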

Excluding webpages from your sitemap makes them less likely to be crawled, but it doesn’t guarantee they stay out of the index. If you really don’t want Google finding and indexing a page, block it with robots.txt or add a noindex directive as well.

Also, you can check your Google Search Console account to see how your sitemap performs.

20. Hacked Content

Hacked content is a cybersecurity concern. It refers to any content found on your website without your consent – added through a security backdoor – to attack users’ privacy or resources.

Like website malware, hacked content can result in the removal of your website from Google search. Google removes content like this from search results to ensure users’ safe browsing.

Final Thoughts

Don’t inadvertently remove your website from Google search by trying every SEO technique you find on the web. Unless you’re deliberately trying to have specific pages excluded from the index, avoid the 20 practices mentioned above.

Google removes content that falls short of its guidelines. Stick to the rules and create quality content that addresses the searchers’ intents to keep growing your site’s presence in search.

Maddy Osman is an SEO Content Strategist at The Blogsmith and the author of “Writing for Humans and Robots: The New Rules of Content Style.”
