Google Webmaster Tools: A Comprehensive Guide


Web site owners and marketers have always craved more information about how Google sees their sites. Google Webmaster Tools addresses these needs, providing support for those looking to diagnose errors, improve a site’s visibility, and declare preferences for how their web site listings are handled.

Of course, before you can begin using Webmaster Tools, you will need an active Google Account. Once you have an account, you can get started by signing in to the Webmaster Tools site.

Before you can get too far, you need to tell Google which sites you want control over. Just enter the URL as prompted, and you’ll soon see that you need to verify your control of each site. You can choose to upload a blank HTML file or paste a META tag into your home page’s HEAD. Either way, it’s a quick way for Google to ensure that you have access to a domain before you can view any of its information.
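For reference, the META tag method takes roughly the following form. The exact tag name and token are supplied by Google when you choose this option; the content value below is only a placeholder, since Google generates a unique token for each account:

```html
<head>
  <!-- Placeholder: Google generates the real content value for your account -->
  <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
</head>
```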

The Overview

With your web site verified, you can now access its profile. To do so, click the Manage: link in the Webmaster Tools Dashboard. You will then be directed to the “Overview” screen. This screen contains high-level details of the most important information on your site as well as links to the specific Webmaster Tools.

For the sake of clarity, we’ll work with the menu on the left, starting at the top and working our way down…


Diagnostics

The Diagnostic tools are here to tell you about any errors that Google has encountered during its normal or mobile crawling sessions. In other words, if Googlebot came through your site and found 404 errors, pages excluded by a robots.txt file, or other faults… this is where you will find out.

The mobile and standard reports provide the same types of information, namely a breakdown of the following errors:

  • HTTP errors
  • Not found
  • URLs not followed
  • URLs restricted by robots.txt
  • URLs timed out
  • Unreachable URLs

Obviously, you’ll want Google to encounter as few errors as possible. Therefore, you should check these reports often to make sure that no problems exist and that all new content is being spidered.
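As a rough sketch of how those buckets relate to what a crawler actually observes when fetching a page, consider the mapping below. This is an illustrative assumption, not Google’s actual logic, and the function name and parameters are hypothetical:

```python
# Hypothetical sketch: bucket a fetch attempt into the error categories
# that the Webmaster Tools crawl reports use. Not Google's actual logic.

def classify_crawl_result(status=None, robots_blocked=False, timed_out=False):
    """Map one fetch attempt to a Webmaster Tools-style report category."""
    if robots_blocked:
        return "URLs restricted by robots.txt"   # blocked before fetching
    if timed_out:
        return "URLs timed out"                  # server too slow to respond
    if status is None:
        return "Unreachable URLs"                # DNS/connection failure
    if status == 404:
        return "Not found"                       # page no longer exists
    if status >= 400:
        return "HTTP errors"                     # other 4xx/5xx responses
    return "OK"                                  # crawled successfully
```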


Statistics

The statistical tools are a gold mine for site marketers. Here, you’ll gain access to a number of reports, each of which I’ll highlight below…

Top Search Queries

The top search queries report shows you how people are getting to your site from a Google search. Once accessed, you’ll see tables showing which search queries most often returned pages from your site, and which of those results were clicked. One cool thing that many overlook here is that Google will show you where in the SERPs your site was listed for particular search terms. It’s not always exact, but it tends to be accurate within a position or two.

What Googlebot Sees

This is a great way to learn how others link to you and how those links are considered alongside your on-page content. Three sections exist on this report: phrases used in external links to your site (anchor text), keywords found in your site’s content and in inbound links, and finally the actual content of your site ordered by keyword density.

Crawl Stats

The crawl stats report is less about actual crawling stats and more about PageRank values. Google breaks your site down and shows you how your site’s PageRank is distributed on a range of Low, Medium, High – and “not yet assigned”. While this information is helpful, it’s often discouraging, as most pages on most sites will always sit low on the PageRank scale.

The best part of this tool is the bottom table, which shows which specific page on your site carries the most PageRank. It’s updated monthly and can be used to measure how successful individual link building campaigns are.

Index Stats

The index stats are nothing more than shortcuts to advanced Google queries on your site using search operators. Shortcut links are provided for the following operators…

  • site:
  • link:
  • cache:
  • info:
  • related:
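If you’d rather run these queries by hand, they look like this (example.com stands in for your own domain):

```
site:example.com      pages from the domain that Google has indexed
link:example.com      pages that link to the domain
cache:example.com     Google's cached copy of a page
info:example.com      summary information Google has about a URL
related:example.com   sites Google considers similar to the domain
```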

Subscriber Stats

Since much of the world now revolves around RSS feeds, readers and content syndication – Google makes stats on your feed subscriptions easy to find – assuming, of course, that you use their feed management systems.

This isn’t a bad tool, but it is new to the Toolbox and could certainly benefit from some updates. If you’re not already using an RSS feed manager though, it’s an excellent option.


Links

Google is all about the links from page to page and domain to domain. The link reports here in Webmaster Tools are limited, but they do provide you with ways to measure internal and external link popularity. This becomes handy if you’re trying to drive increased exposure to new areas of your site, or you’re looking for a way to stay on top of your link popularity over time.

Google Sitemaps

Google Sitemaps are what the entire Webmaster Tools suite was initially built around. Here you can upload and manage XML-based sitemap files that catalog all of the pages on your site. Using these XML files, Google can then access all of the pages you’d like it to.
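A minimal sitemap file following the sitemaps.org 0.9 schema looks like this; the URL, date, and other values are placeholders to adapt for your own site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```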

Sitemaps have been around for a while now and are a much deeper topic than this article can do justice to. If you’re interested in reading more, I’d first refer you back to an article here on Search Engine Journal from July 2005 from Stephen Brennan.


Tools

Last, but certainly not least – is the tools area of Webmaster Tools. Seems redundant, right? Sure it is – but this is where you’ll get the most from Webmaster Tools, so stay tuned!

Analyze robots.txt

If you don’t already know about robots.txt, it’s basically a protocol that lets Googlebot and other spiders immediately find instructions on what they can and cannot access on your site. If you don’t want spiders indexing your images, just disallow them. If you’d prefer not to have certain areas of your site indexed and available to the searching public – go ahead and restrict access.
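For example, a robots.txt file that keeps all spiders out of an images directory and a private area might look like this (the paths are placeholders):

```
User-agent: *
Disallow: /images/
Disallow: /private/
```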

Here, though, you can check to make sure your robots.txt file is not only up to date, but also valid in terms of how it is written. Many times, a simple change to a robots.txt file will force Google and other engines to drop dozens or even hundreds of pages at a time. With that in mind, use this tool whenever you make changes to your robots.txt files!

Manage site verification

Remember the first step involving the META tags or blank HTML files? Well, here you can go back and alter how your site is verified – and even protect your site. Say, for example, your SEO provider enrolls you in Webmaster Tools and you later pull the plug on their services. If not for this area of Webmaster Tools, your information would remain available to them forever.

Instead, Google’s actually watching your back and reporting if others aside from yourself have access.

Set crawl rate

This area is very informative, as it provides an overview of Googlebot’s activity on your site. Many people view this area and figure it’s useless (unless the rare option to adjust crawling speed is enabled).

I’d like to offer up some tips though to make better use of the activity graphs.

First, let’s recognize that you’ll be limited to a 90-day span. Second, think back to when you’ve recently launched new content, begun new link campaigns, or started hosting rich media like videos and streaming audio.

All of these events may lead to spikes in the graphs. It’s these spikes that I really want you to review actively. For example, if you’re using new resources for backlinks, you can measure how responsive Google is by seeing how those spikes correspond to your efforts. The same holds true if you’re making any efforts at all to improve link popularity of any kind.

The end goal with these, though, is to get Google coming in more often, requesting more pages and interacting more with your site. That will allow you to get new content indexed and ranked more quickly in the future.

Set preferred domain

Tired of seeing both the www and non-www versions of your domain in your search results? Or maybe you’ve become worried about canonical URLs and how they may be impacting your optimization and links? Well, the set preferred domain tool is one way to help police things a bit.

Using this tool you can instruct Google to display URLs according to your preference, so all of your listings appear under a single preferred version of your domain.

Of course, if you’re not worried about this – you can also opt to have no associations set at all.
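The preferred domain setting only affects how Google displays your URLs. If you also want to enforce one version at the server level, a common complement on Apache is a 301 redirect along these lines – a sketch assuming mod_rewrite is available and example.com stands in for your domain:

```apache
# Send any request for the bare domain to the www version with a 301 redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```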

Enable enhanced image search

If you love to appear in the Google Image search results, you might be able to have some fun with this tool. It’s important to note though, that many have had a poor experience, ultimately losing out on a lot of traffic.

The core of the issue is this. Enhanced image search allows you greater control for labeling images that reside on your domain. While that sounds fun and exciting, Google’s trying to make a game of it. Literally.

Altering labels on images means that you are also using Google Image Labeler. Image Labeler is a way to help improve image relevancy, and users are encouraged to participate by earning points.

What can you do with the points? Nothing yet. Maybe there’s a perk associated with them down the line, but for now, my advice would be to stay away from enhanced image search until more information is available.

Remove URLs

For some reason I love any tool that lets me tell Google what to do. This automated tool is available to help resolve issues with pages that no longer exist or that create problems for you by being in the Google index.

A great example is how people leave their servers wide open for Googlebot to explore. Common mistakes include having your web analytics indexed (folks with Webalizer, AWStats, etc. are common culprits). Using this tool, you can go in, remove those pages from the index – and then go back and make the necessary edits to your robots.txt files.

In short, it’s a great way to expedite the removal of information that simply has to go.


Making the Most of Webmaster Tools

While the above serves as a guide to each tool in the entire Webmaster Tools line, there are certainly ways to make the most of your experience with Google.

First, make note of all the information Google will allow you to download. Then, make sure you put it in your schedule to download everything on at least a 60-day cycle. Since Google provides information 90 days at a time, you’ll want your historical reports to overlap for analysis.

As you use these tools more, you’ll find a number of new ways to review the data. I’ve spent days reviewing all of the inbound links to some of my domains, and suggest you take a few hours to do the same. You’ll learn a lot more about your site than you thought was available – and in the end – information is the power you need to be successful.