Five Hidden Gems in Google Webmaster Tools

Within Google Webmaster Tools, hidden gems abound. But there are five very special hidden gems that will help make your life as an SEO easier.

Most webmasters are familiar with Google Webmaster Tools (GWT), a toolbox created by Google to help uncover technical issues. In fact, if you are considering buying another analytics tool, I highly recommend exploring GWT first. You may not need another tool at all if the issues on your site are restricted to those found in GWT.

Fetch as Googlebot

Are you tired of waiting for Google to come back and crawl your site after an update? Enter Fetch as Googlebot. This little gem lets you have Google re-crawl a page on demand. It also lets you see any web page just as Google sees it, which makes it a great tool for finding and identifying problematic pages. For example, if your site has been hacked and is being used to display information that is part of another website (a form of cloaking – not to be confused with iFrames), Fetch as Googlebot allows you to see this. The tool lets you fetch up to 500 URLs per week.
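As a rough companion check, you can fetch a page yourself once with a normal browser User-Agent and once with a Googlebot User-Agent, then compare the two responses. This is not a GWT feature, just a hypothetical sketch; the function names and the 0.9 threshold are my own inventions:

```python
import difflib

def content_similarity(html_a: str, html_b: str) -> float:
    """Rough similarity ratio (0.0 to 1.0) between two HTML payloads."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def looks_cloaked(browser_html: str, googlebot_html: str, threshold: float = 0.9) -> bool:
    """Flag a page whose Googlebot version diverges sharply from its browser version.

    A hacked page that serves spam only to crawlers will score well below the
    threshold; ordinary pages should score near 1.0.
    """
    return content_similarity(browser_html, googlebot_html) < threshold
```

In practice you would feed this the HTML fetched with your browser and the HTML fetched while sending Googlebot's User-Agent string; a low score is a hint to look closer, not proof of cloaking.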

Advanced Data in Index Status

While the basic tab displays the total number of pages in Google's index, that figure alone won't help you identify all crawl issues. To dig deeper, go to Crawl > Index Status and open the Advanced tab. Click Advanced, select the data you want, and then click Update to refresh the page. You will see all the URLs on your site that Google has ever crawled, as well as URLs that Google is unable to crawl due to robots.txt. If you are experiencing ranking and indexation problems, these data points can help you pinpoint issues on your site that arise from problem pages.

When using this data, it is important to look closely at your indexed-page count. Say you know your site has 1,343 pages (including your blog), but the indexed count reads 4,533: Google may be indexing the search results pages generated by the on-site search plug-in you installed. That kind of duplicate content can cause serious ranking problems.
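The sanity check above boils down to simple arithmetic. Here is a tiny hypothetical helper (the function name and the 10% tolerance are my own, not anything built into GWT) that flags when the reported index count outruns the pages you actually have:

```python
def index_looks_bloated(known_pages: int, reported_indexed: int, tolerance: float = 1.1) -> bool:
    """True when GWT reports noticeably more indexed URLs than the site really has.

    The default 10% cushion (tolerance=1.1) allows for harmless extras such as
    feed URLs; anything beyond that suggests duplicate content, for example
    indexed on-site search results pages.
    """
    return reported_indexed > known_pages * tolerance

# The article's example: 1,343 real pages but 4,533 reported as indexed
index_looks_bloated(1343, 4533)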

Get Back Your Keyword Data With Google Custom Search

If you are still upset by the removal of Google keyword referral data from Google Analytics due to “not provided,” there is a simple way around it (although not exactly reflective of Google's actual search data): install Google Custom Search on your site. Using the Custom Search section under Other Resources in Webmaster Tools, you can build a completely tailored search experience on your site. This lets you see what visitors are searching for on your site, and you can use that data to inform decisions about your search campaigns. It is a great way to recover some of that keyword data, because it shows what your users are really looking for.

Blocked URLs

This section is especially useful for uncovering crawl issues caused by errors in the website's robots.txt file. For example, this tool will help you discover whether someone created a robots.txt file that unintentionally blocks pages because of typos in its directives. By utilizing this hidden gem, you can uncover simple (and sometimes more complex) crawl issues that arise from the improper use of robots.txt directives.
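You can reproduce the kind of one-character slip this report catches with Python's standard-library robots.txt parser. The rules and URLs below are invented purely for illustration:

```python
import urllib.robotparser

# A one-character slip: the author meant to block only /tmp/ but blocked everything.
broken = urllib.robotparser.RobotFileParser()
broken.parse([
    "User-agent: *",
    "Disallow: /",        # typo: should have been "Disallow: /tmp/"
])

fixed = urllib.robotparser.RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow: /tmp/",
])

print(broken.can_fetch("Googlebot", "https://example.com/about.html"))  # False: page blocked
print(fixed.can_fetch("Googlebot", "https://example.com/about.html"))   # True: page crawlable
```

Running your own robots.txt through a parser like this before deploying it is a cheap way to catch exactly the typos the Blocked URLs report would otherwise surface after the damage is done.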

Context: The Last Hidden Gem of GWT

One last hidden gem in Google Webmaster Tools, in your quest to identify and correct website errors, is understanding the real-world context of the problems the tool reports. While indexation data can reveal issues like too many indexed pages, be careful that another site or design issue isn't the real cause.

Find out about these issues by performing your due diligence and investigating problems you aren't sure about. Ask questions, and follow up with your web developer or IT director if there are major issues. Context can mean the difference between getting issues fixed on time and within budget, and simply creating more problems.

As you learn the intricacies of GWT, it becomes easier to use and you will be able to identify issues more quickly. These suggestions are just the tip of the iceberg, but they can help you identify some serious site issues without having to spend more money on additional tools.


Featured image credit: Shutterstock.com. Used under license.
All screenshots are original and created by this post’s author.

Brian Harnish

Since 1998, Brian Harnish has been building websites. Brian is a professional SEO with web design and web development skills. His expertise in nearly all areas of web design, web development, and SEO serves as the foundation for his own blog at http://www.brianharnish.com/. Brian now works as an SEO Specialist with James Publishing & Attorney Marketing in Costa Mesa, CA.


23 thoughts on “Five Hidden Gems in Google Webmaster Tools”

    1. Hi Phil,

      It is the most up-to-date version of Webmaster Tools. Nothing should be different unless Google has updated GWT between the time I took the screenshots and the time this post was published. :)

  1. I don’t know that the keyword data is quite the same, but it works if your site is large enough and has multiple sections with on-site search. It wouldn’t apply to most business websites, though. Great stuff and nice use of screenshots throughout!

    1. Justin – you are correct! The keyword data will be different than Google’s keyword data. For sites that have a lot of traffic and users who actively search, this tool can be a gold mine of information. Thank you for the feedback, Justin! I’m glad you enjoyed my post. :)

  2. “Fetch as Google” & “Custom Search” have been my top favorites. As a force of habit, as soon as I update any post, I use “Fetch” and MAKE Google crawl my link. Within 15 minutes, I can see my link in the SERPs. Pretty neat!!!

    1. Preeti – Yep! It has been my GWT tool of choice for a while now. I love Fetch as Google for its seemingly endless spider-on-demand capabilities. :)

  3. Nice post Brian, thanks for the share.

    I did verify my website with Google Webmaster Tools like you explain, under Google Index with the Advanced option. It lists Total indexed and Blocked by robots, but the indexed count shows zero. If I check the sitemap stats, though, it shows 54 submitted and 49 indexed, so I am not sure why the Google Index section says zero indexed. Any feedback from you Brian, or anybody whose stats show this way?

    Thanks a bunch

    Great share


    1. John,
      The line graph probably just has not caught up to the bar graph in the sitemap section; the line graph only updates about once a week. Try searching site:yoursite.com

      1. John – Grant is correct in that it probably has not been updated. Aside from that, I would also double-check your robots.txt file to ensure that you don’t have any directives blocking indexation. Here is a brief listing of robots.txt commands: http://www.robotstxt.org/robotstxt.html. WARNING: if you are not adept at editing robots.txt, I STRONGLY advise having an expert review that file to make sure you aren’t blocking indexation, especially unintentionally; otherwise, you can cause undesirable side effects.

  4. The biggest problem I have with Webmaster Tools is that it does not filter local listing rankings from regular rankings. I would like to see regular rankings without local listing rankings; it is pretty misleading when you check your average rankings.

  6. Hi Brian,

    I enjoyed reading your article. I have a website that has been running for over a year. Last week I found that 35 of its 90 pages are not indexed in Google (as shown in Webmaster Tools). I fetched all of those pages from there, but it did not help much. Can you give me a suggestion on how to solve this issue?

    1. Hi Ram,

      Thank you! I’m glad you’re enjoying my article. I would do several things: 1. Make sure the pages are not somehow blocked in robots.txt. 2. Make sure you don’t have a noindex,nofollow directive in the meta robots tag on the pages you want indexed (e.g. <meta name="robots" content="noindex,nofollow">). 3. If you have both an HTML and an XML sitemap, make sure both are updated to include all 90 pages you want indexed. 4. Keep in mind it can take several days for Google to index the pages as well. 5. Try linking to those pages from a high-PR, frequently crawled domain. 6. If you continue to have indexation issues, I would consider having an expert perform a thorough review of your site to ensure there are no other crawl issues preventing indexing of your pages.
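      For step 3, a minimal XML sitemap entry looks like the sketch below; example.com, the path, and the date are placeholders, and you would repeat the url element once per page (all 90 in this case):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page-you-want-indexed.html</loc>
    <lastmod>2013-11-01</lastmod>
  </url>
</urlset>
```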

  7. Hi Brian,
    Great article!

    Please help, I have two questions:
    1. Is it safe to use the “Fetch as Google” function every day?
    2. I have changed some slugs/URLs on my website and tried to delete the old slugs with the “Remove URLs”
    tool, but they still appear to be indexed by Google. Any suggestion?

    Thank you

    1. Hi Andy,

      Thank you!!

      1. I have seen no issues from using Fetch as Google every day. In fact, you can use it to submit up to 500 URLs a week (Google’s official limit). Assuming your pages are high quality and all meet Google’s guidelines (on-site and off-site), I don’t think you would have an issue. Don’t quote me on this, however; this is only my experience, and it could vary depending on how many pages you fetch per day.

      2. For this, assuming you no longer need the old URLs, I would use a 301 redirect to permanently point each old URL at its new one. In time, Google should treat the new URL as the primary URL for that page.
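      For example, if the site runs on Apache (an assumption on my part; the slugs below are made up), a one-line mod_alias rule in .htaccess handles each old URL:

```apache
# Permanently redirect an old slug to its replacement (requires mod_alias)
Redirect 301 /old-post-slug/ http://www.example.com/new-post-slug/
```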

  8. Hi, great article!
    As of yesterday, Webmaster Tools seems to show no indexed pages for my site (in the Sitemaps section of the tool). However, all my pages appear in Google search when I use “site:”. Do you have an explanation for this? Is it something I should be worried about?

    And one more question. What is considered an optimal sitemap submission frequency? Is it ok with Google if I submit my sitemap once a week? What about Google News sitemaps? Could I submit my news sitemap two or three times a day without being penalized?
    Thanx again for a great post.