
Google XML Sitemaps : Essential FAQ

An SEJ reader sent in the following question:

Is it a must to create a sitemap for your site and submit it to Google? Can it harm?

Surprisingly, this question comes up often. Webmaster forums keep circulating horror stories about site traffic dropping after a sitemap was created and submitted to Google Webmaster Tools. So here is a list of frequently asked sitemap-related questions to clear up the FUD.

Is a sitemap a must for a newly launched site?

No; a decent link structure and well-formed, semantic markup should come first.

Does a Sitemap help in indexing new sites?

The only guaranteed role of a sitemap is to help Google discover new sites and new URLs. This doesn't mean Google will crawl or index them. However, there have been rumors that sitemaps might help with faster crawling and indexing, and even with long-tail rankings.

If a page is not included in a sitemap, will that encourage Google to devalue it or even drop it from the index?

No; a sitemap only gives Google additional information about your URL structure. It doesn't mean Google will crawl only the URLs in your sitemap once you submit it to Google Webmaster Tools.

Are there still any tricks to speed up indexing with a sitemap?

It's up for debate, but more frequent <changefreq> settings might help a site get indexed faster and rank better.
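For reference, <changefreq> is a per-URL hint set inside the sitemap file itself (valid values are always, hourly, daily, weekly, monthly, yearly, and never); a minimal sketch, with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blog/</loc>
    <lastmod>2008-06-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that the protocol documents <changefreq> as a hint, not a command; search engines may crawl more or less often than it suggests.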

What format is preferable if I want to submit my Sitemap to Google Webmaster Tools?

If you have fewer than 50,000 URLs in your sitemap (the per-file limit set by the sitemap protocol), both compressed (.gz) and non-compressed formats will do.
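To make the compressed/uncompressed point concrete, here is a small Python sketch that writes the same sitemap in both formats; the URL list and file name are hypothetical:

```python
# Sketch: write one sitemap both uncompressed (sitemap.xml) and
# gzipped (sitemap.xml.gz); Webmaster Tools accepts either format.
import gzip

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(urls, path="sitemap.xml"):
    """Write the sitemap to `path` and a gzipped copy to `path`.gz."""
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="%s">\n' % SITEMAP_NS
           + "".join("  <url><loc>%s</loc></url>\n" % u for u in urls)
           + "</urlset>\n")
    with open(path, "w") as f:          # plain-text copy
        f.write(xml)
    with gzip.open(path + ".gz", "wt") as f:  # compressed copy
        f.write(xml)
    return xml
```

Either file can then be submitted; the compressed copy simply saves bandwidth on large sites.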

Do I need to include any other types of content (e.g. image or video files) in my sitemap?

The general rule of thumb is to include only content that can be ranked. Since you can't expect images to rank in general web search, don't include them in your general sitemap. It is wise to use a different sitemap format for other types of search results (e.g. image, video, news, and mobile sitemaps).
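For instance, Google defines a dedicated image extension namespace for sitemaps; a minimal sketch, with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/gallery.html</loc>
    <image:image>
      <image:loc>http://www.example.com/photos/sunset.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Video sitemaps follow the same pattern with their own namespace, keeping this specialized content out of your general sitemap.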

Can sitemaps be harmful to your site?

Rumor has it that after adding a sitemap to Webmaster Tools, you might see more pages in the index but much less traffic.

Besides, sitemaps are reported to make scrapers' lives easier by providing a clear website structure. What's more, some free sitemap generators seem to save, and later use, the data they generate.

Conclusion? I for one have never had a bad experience with sitemaps (though I keep reading about such stories now and then and agree there is probably no smoke without fire). On the other hand, I haven't seen any major positive effect either. Still, I don't think Google is so evil as to offer a tool and then turn it to bad ends (call me naive).

Ann Smarty is the blogger and community manager at Internet Marketing Ninjas. Ann's expertise in blogging and tools serve as a base for her writing, tutorials and her guest blogging project, MyBlogGuest.com.


13 thoughts on “Google XML Sitemaps : Essential FAQ”

  1. I can’t tell whether or not using sitemaps affects indexing and ranking. Normally, Googlebot is ahead of me in indexing new pages (I manually create and add sitemaps), so it feels somewhat useless. Nevertheless, I fear that not using them can harm indexing. I hope someone has done some sort of A/B comparison to see whether doing this is smart, stupid, or useless.

  2. Ann, thanks for sharing your insight on sitemaps. You’re right, it’s highly unlikely (not impossible) that Google would risk offering a tool that adds no value to users.

  3. You do not need any sitemap. What for? Google bots will be the first to find out that you have made a new page. Second, with proper internal linking and some inbound links, Google will crawl and index your page. A sitemap would not speed up matters at all; the same goes for Add URL to Google. Let Google do things the regular way…

  4. Nice tips, and I wonder if blogs like blogger.com have their own sitemaps to index the site. Sorry if my questions are too basic. I’m just trying to understand the importance of XML sitemaps.

  5. I have not seen any effect for sites with or without sitemaps. In my experience, a proper link strategy gets your site listed quicker than sitemaps do.

  6. Nice writeup, Ann!

    @john, Blogger automatically submits Sitemap files (they use the RSS feed) through the robots.txt for each blog, so you don’t have to do that yourself.

    Regarding “Rumor has it, after adding a sitemap to Webmaster Tools, you might see more pages in index but much less traffic”: if you believe that rumor (I don’t), you can submit your Sitemap file without using Webmaster Tools, either by listing it in your robots.txt or by using the HTTP “ping” method. However, I really would recommend using Webmaster Tools; you never know when you’ll need the information provided there (say, when your site gets hacked …).
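For reference, the two non-Webmaster-Tools submission methods mentioned in the comment above look like this (the sitemap URL is hypothetical):

```
# robots.txt -- Sitemap autodiscovery line
Sitemap: http://www.example.com/sitemap.xml

# HTTP "ping" method: fetch this URL (sitemap URL is percent-encoded)
# after the sitemap is updated
http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Both methods tell Google where the sitemap lives without creating a Webmaster Tools account for the site.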

  7. Good stuff Ann.

    I’ve used three approaches in regards to sitemaps on websites… strangely enough, I’ve gotten the best results from tactic #3:

    1) not bothering to create a sitemap at all

    2) toeing the Google line and creating it exactly to their specs

    3) creating a manually-updated chunk of helpful links and sticking them in a div on the content pages themselves… and calling it a sitemap

    Now I’d say that my results are a function of the kinds of sites I’ve built (regional businesses with fairly small sites), and that this isn’t in any way conclusive evidence for SEOs… but it’s interesting nonetheless.

    What I’ve gleaned from this tactic is that it has improved internal link architecture, and also seemed to boost my chances of picking up supplementals in the SERP listing (the technique places links on every content page that have high “user appeal” and validate the business… I’m convinced that Google spiders are trained to enjoy the taste of these things… like “Contact Us”, “Department Information”, “Map & Directions”, etc.)

    And from a UI standpoint I’ve always liked this better anyway, as it keeps the user in the content. I’m not a fan of going away from the content to a sitemap, just for the purpose of getting back to the content again.

  8. @ Mitch: funny. It sounds like blackhat seo at first, but when done visibly, it should be no problem.

    I have similar experiences with placing a div with links to the most-read pages on every page. The linked pages seem to be crawled more often and are given more priority within all results for the website in the SERPs.

    And, as often happens with SEO, this result was unintended. I just wanted a quick list of most-read pages so that every visitor could immediately see what the top pages are. The SEO result turned out to be a positive side effect.

  9. Right Inge, I too did it more for user experience than SEO… but that’s why Googs is great. If you build it [for users], they [robots] will come.

    P.S. I just noticed I misspoke in my previous comment… instead of “boost my chances of picking up supplementals in the SERP listing” it should be referring to picking up breakout link results in the SERP listing.

  10. @Ann: could you please explain this query: “How can we create a Sitemap.xml file of more than one lakh (100,000) links?”

    And how do we manage this sitemap for dynamic or static sites for the best output from search engines?

    If possible, please suggest a tool and explain the best way to handle this.

    Thanks
    Paul

  11. @jay: A great Dreamweaver extension for generating sitemaps for large static or dynamic sites is Surveyor. It not only generates XML sitemaps but also HTML maps, and it has automatic search engine submission and sitemap update reminders. Again, it’s just for Dreamweaver users.

    If you have a small site, try xml-sitemaps.com.
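On the one-lakh (100,000+ URL) question above: the sitemap protocol caps each file at 50,000 URLs, so very large sites split their URL list across multiple sitemap files and tie them together with a sitemap index. A minimal Python sketch, where the file names and base URL are hypothetical:

```python
# Sketch: split a large URL list into sitemap files of at most
# 50,000 URLs each (the protocol's per-file limit), plus a
# sitemap index file that lists them.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000

def build_sitemap(urls):
    """Return one sitemap file (<= 50,000 URLs) as an XML string."""
    entries = "".join(
        "  <url><loc>%s</loc></url>\n" % escape(u) for u in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="%s">\n%s</urlset>\n' % (SITEMAP_NS, entries))

def build_index(sitemap_urls):
    """Return the sitemap index listing each sitemap file."""
    entries = "".join(
        "  <sitemap><loc>%s</loc></sitemap>\n" % escape(u)
        for u in sitemap_urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="%s">\n%s</sitemapindex>\n'
            % (SITEMAP_NS, entries))

def split_into_sitemaps(all_urls, base="http://www.example.com/"):
    """Chunk the URL list; return (index_xml, [sitemap_xml, ...])."""
    chunks = [all_urls[i:i + MAX_URLS]
              for i in range(0, len(all_urls), MAX_URLS)]
    sitemaps = [build_sitemap(c) for c in chunks]
    index = build_index(
        ["%ssitemap-%d.xml" % (base, n + 1) for n in range(len(chunks))]
    )
    return index, sitemaps
```

Only the index file then needs to be submitted to Webmaster Tools; Google follows it to the individual sitemap files.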

  12. What if, when I submit my sitemap, I get a “Network unreachable: robots.txt unreachable” or “Network unreachable: network unreachable” error? Does that mean Google’s crawlers will no longer crawl and index my site?