
X-Robots-Tag: A Simple Alternative to Robots.txt and the Meta Tag

Before moving on to the X-Robots-Tag, let’s have a quick word about robots directives. As webmasters, we already know the term: they allow you to hide pages, folders, sub-domains, or any other content from a search engine spider.

There are two ways you can use robots directives.

Robots.txt: This is a plain text file where you specify which content should be hidden from spiders. You can find it at the root of your domain or sub-domain.

Syntax:

User-agent: *
Disallow:

Example: http://www.yourdomain.com/robots.txt

You can also reference your XML sitemap in the file so search engines can find and index your pages more quickly.
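
As a rough sketch (the folder name and sitemap path below are placeholders, not values from this article), a robots.txt that blocks one folder and points to a sitemap could look like this:

# Block one example folder for all crawlers and advertise the sitemap
User-agent: *
Disallow: /private/

Sitemap: http://www.yourdomain.com/sitemap.xml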

Robots Meta Tag: The robots meta tag gives you a page-specific way to control how an individual page should be indexed and served to users in search results. You have to put it in the <head> section of the page.

Syntax:

<meta name="robots" content="noindex">
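
As a quick sketch (the noindex, follow combination here is only an illustrative choice), a page that should stay out of the index while still letting crawlers follow its links could carry this in its <head>:

<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
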
Now let’s talk about the X-Robots-Tag. First, let me be clear: if you are already blocking your content with robots.txt or the robots meta tag above, there is no need to use this one as well.

What is the X-Robots-Tag?

The X-Robots-Tag can be used as an element of the HTTP response headers that the server sends for a given URL.
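
In other words, the directive travels with the server’s response rather than inside the page. A response carrying it might look roughly like this (the status line and other headers are only for illustration):

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex
(...)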

How can I instruct crawlers not to index a page?

X-Robots-Tag: noindex
How can I keep a page out of the search results and also stop crawlers from following its links?

X-Robots-Tag: noindex, nofollow

Benefits of using the X-Robots-Tag:

  • You can use it where robots meta tags are not possible, for example to block non-HTML files such as images, videos, and Flash content.
  • You can add the X-Robots-Tag to a site’s HTTP responses using .htaccess and httpd.conf files (see the sketch after this list).
  • It is global: an X-Robots-Tag set in the HTTP responses lets you specify crawling directives that apply across the whole site.
  • You can use regular expressions, which gives you a lot of flexibility for complex URLs or content types.
  • Like the robots meta tag, it supports other directives for your page such as nosnippet, noodp, and notranslate.
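
As a rough sketch of the .htaccess approach mentioned above (assuming Apache with mod_headers enabled; the file extensions and directive values are only an example), you could keep PDFs and images out of the index like this:

# Serve an X-Robots-Tag header for matching non-HTML files
<FilesMatch "\.(pdf|jpe?g|png|gif)$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>

You can then confirm the header is being served by inspecting the response headers of any matching URL, for example with curl -I.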

If you do not want to use robots.txt or the robots meta tag, or you need to block non-HTML content, then use the X-Robots-Tag. Check the announcement from Google about the X-Robots-Tag here.


Chandan Das

Owner and Co-founder at Clicks4info
Chandan Das is the owner and co-founder of Clicks4info. He is interested in blogging, SEO, and SMO, and also spends time on dancing, gaming, and other fun things.


6 thoughts on “X-Robots-Tag: A Simple Alternative to Robots.txt and the Meta Tag”

  1. I believe robots.txt is respected if the bot comes in through the homepage, but if the bot follows a deep link from another site there is a chance that page can get indexed, so it makes sense to have both robots.txt and an on-page meta robots or X-Robots-Tag on the page as well. Not to mention that it protects you if your dev team accidentally deletes your robots.txt file, which is always possible, or if you incorrectly code a disallow directive.

  2. Useful news, but the X-Robots-Tag is only effective after the page has been requested and the server responds, and the robots meta tag is only effective after the page has loaded, whereas robots.txt is effective before the page is requested. BTW, great writing Chandan!

  3. Sounds interesting, but it also seems like it might be harder to implement and update for an SEO who relies on a dev team to get code up on the site.

    It’s basically like a robots meta tag; I don’t see much of a difference. It might make sense to use this in place of a robots meta tag, accompanied by a robots.txt.