
Matt Cutts Says Do Not Block Googlebot From Crawling JavaScript and CSS

Matt starts by saying that this video is a short “Public Service Announcement,” so I think you should pay close attention to everything he says. Matt explains what to do and why he is offering this PSA. Please listen and share it with other SEOs, web designers, and developers.
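
For context, the blocking Matt is warning against typically lives in a site’s robots.txt file. Here is a minimal sketch of the pattern he says to remove (the /js/ and /css/ paths are just illustrative; your site may use different directories):

    # The kind of robots.txt rules Matt says to remove.
    # Blocking these folders stops Googlebot from fetching the
    # scripts and stylesheets it needs to render your pages.
    User-agent: *
    Disallow: /js/
    Disallow: /css/

Deleting those Disallow lines (or leaving the path empty, as in “Disallow:”, which allows everything) leaves Googlebot free to crawl your JavaScript and CSS.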


Melissa Fach

SEJ Editor - Melissa is the owner of SEO Aware, LLC. She is a consultant and trainer helping companies make the most of their content marketing and SEO. She specializes in the psychology behind blogging and content marketing. Melissa is also an associate on the Community team at Moz, an associate and writer at CopyPress, and an editor at Authority Labs. She is a self-proclaimed Star Wars and Internet geek and volunteers with big cats at BigCatHabitat.org.



6 thoughts on “Matt Cutts Says Do Not Block Googlebot From Crawling JavaScript and CSS”

  1. It was posted in August 2011 but has fooled everyone (including me) into thinking it was news. That doesn’t change the message, of course.

  2. The purpose of including CSS in the crawling path is to let Google know what is important. Sometimes heading tags are used to declare the importance of certain words, and by merely crawling the HTML one cannot tell which words matter. There was a time when keyword density was more important and people stuffed keywords into their content. But today it is clear that even while a page repeats certain keywords, it may really be emphasizing something else in a heading tag.

  3. This is actually really good information for anyone who didn’t already know it. If you’re using a plugin to hide/show content and putting it in your js folder, then blocking that folder in robots.txt, you’re also blocking that content from appearing.

    There really is no downside to unblocking – Google will crawl it regardless.

  4. You are right, @Pavlicko. Good information for anyone who didn’t already know it. If you’re using a plugin to hide/show content, putting it in your js folder, and then blocking that folder in robots.txt, you’re also blocking that content.

  5. Well, it’s obvious what he wants: to make sure people don’t use JavaScript and CSS for hidden text and the other black-hat methods possible with JS and CSS.

  6. I have always blocked images, CSS, and JS through robots.txt for all crawlers. But yesterday I took Matt’s recommendation and opened the floodgates. Let’s see what happens to my SERPs over the next couple of weeks…

    - worried webmaster