
Google Removes Robots.txt Guidance For Blocking Auto-Translated Pages

Google removes robots.txt guidance for blocking auto-translated pages. This change aligns Google's technical documents with its spam policies.

  • Google removed guidance advising websites to block auto-translated pages via robots.txt.
  • This aligns with Google's policies that judge content by user value, not creation method.
  • Use meta tags like "noindex" for low-quality translations instead of sitewide exclusions.

Google has updated its documentation, removing advice that suggested using robots.txt to block automatically translated pages from search results.

This change aligns Google’s technical docs with spam policies introduced over a year ago.

“This is a docs-only change, no change in behavior,” Google clarified in its Search Central changelog.

Why This Matters

Removing a few lines from the documentation may seem minor, but it shows Google’s view on automated content is changing.

Google removed the guidance because it had become outdated after the company rolled out its “scaled content abuse” policies last year.

These policies evaluate content based on the value it provides, regardless of how it was created.

For websites with multilingual content, here’s what this means.

Old Approach

  • Block auto-translated content via robots.txt
  • Avoid indexing automated content

New Approach

  • Evaluate translation quality case by case
  • Focus on user value over creation method
  • Use page-level controls (like meta robots tags) instead of blanket blocks (see the sketch below the note)

Note: While Google never officially stated that all machine translations were spam, earlier guidance leaned toward blocking them by default. The new policies encourage more nuanced evaluation.
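
As a rough sketch of what page-level control looks like in practice (the /es/ URL in the comment is a hypothetical example), a low-quality translated page can be kept out of the index with a robots meta tag, or with the equivalent X-Robots-Tag HTTP response header:

    <!-- In the <head> of a low-quality translated page, e.g. /es/some-page -->
    <meta name="robots" content="noindex">

    # Equivalent HTTP response header, useful when editing the HTML is impractical
    X-Robots-Tag: noindex

Either signal removes just that page from Google's index while leaving the rest of the translated section crawlable and eligible to rank.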

Related: Google On Scaled Content: “It’s Going To Be An Issue”

What to Do Now

While Google says this is a docs-only change with no change in behavior, it’s worth considering these steps:

  • Review your robots.txt: Remove outdated rules blocking translated content if the translations serve real user needs.
  • Set quality standards: Not all machine translations are equal. Keep the good ones, noindex the bad ones.
  • Think user-first: Ask whether your translated content genuinely helps international visitors or just expands keyword coverage.
  • Reinforce page-level control: Prefer meta tags like noindex for low-quality translations over sitewide robots.txt exclusions (a sketch of both follows this list).
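
For the robots.txt review, here is a hedged before-and-after sketch; the /translated/ directory name is made up for illustration. The old guidance leaned toward a blanket Disallow, while the current approach removes it and handles weak pages individually with noindex:

    # Old approach: block every auto-translated URL from crawling
    User-agent: *
    Disallow: /translated/

    # New approach: allow crawling of the translated section...
    User-agent: *
    Disallow:

    # ...and mark only the low-quality pages with noindex (meta tag or
    # X-Robots-Tag header), as shown in the earlier example.

Keep in mind that noindex only works if Google can crawl the page: a URL blocked in robots.txt never gets fetched, so the tag is never seen, which is why the Disallow rule has to come out first.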

See also: Google On Robots.txt: When To Use Noindex vs. Disallow

The Takeaway

This documentation change may seem small, but it shows how Google’s views can shift over time.

For SEO professionals managing multilingual sites, this is a reminder to stay adaptable and focus on what helps your users.


Featured Image: Roman Samborskyi/Shutterstock
