Google, Yahoo! and Microsoft have joined forces to support Sitemaps 0.90 (www.sitemaps.org), a free and easy way for webmasters to notify search engines about their websites and have them indexed more comprehensively and efficiently, resulting in better representation in search indices. There is no word on why Ask.com is not part of the equation, or whether it plans to be.
For users, Sitemaps enables higher-quality, fresher search results. Initially driven by Yahoo! and Google, the initiative builds upon the pioneering Sitemaps 0.84 protocol, released by Google in June 2005, which is now being adopted by Yahoo! and Microsoft to offer a single protocol to enhance Web crawling efforts.
How Sitemaps Work
A Sitemap is an XML file, made available on a website, that tells search engines which pages are available to crawl. It is an easy way for webmasters to make their sites more search engine friendly: it lets them list all of their URLs along with optional metadata, such as the last time each page changed, to improve how search engines crawl and index their websites.
Sitemaps enhance the current model of Web crawling by allowing webmasters to list all their Web pages to improve comprehensiveness, notify search engines of changes or new pages to help freshness, and identify unchanged pages to prevent unnecessary crawling and save bandwidth. Webmasters can now submit their content universally in a uniform manner. Any webmaster can submit their Sitemap to any search engine that has adopted the protocol.
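Under the 0.90 protocol, the file itself is a simple XML document in the www.sitemaps.org namespace, with one `url` entry per page and optional metadata tags. A minimal example (the URL and dates here are illustrative, not from the announcement):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `loc` is required; `lastmod`, `changefreq` and `priority` are the optional hints that let crawlers skip unchanged pages and prioritize fresh ones.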
The Sitemaps protocol used by Google has been widely adopted by many Web properties, including sites from the Wikimedia Foundation and the New York Times Company. Any company that manages dynamic content and a large number of web pages can benefit from Sitemaps. For example, if a company that uses a content management system (CMS) to deliver custom web content (pricing, availability, promotional offers and so on) to thousands of URLs places a Sitemap file on its web servers, search engine crawlers will be able to discover which pages are present and which have recently changed, and crawl them accordingly. By using Sitemaps, new links can reach search engine users more rapidly, because the Sitemap informs search engine “spiders” and helps them crawl more pages and discover new content faster. This can also drive online traffic and make search engine marketing more effective by delivering better results to users.
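For a CMS with thousands of URLs, generating the file can be automated. A minimal sketch in Python (the page list and URLs are hypothetical; a real system would pull them from its content database):

```python
from datetime import date

# Hypothetical page records a CMS might export: (URL, last-modified date).
pages = [
    ("http://www.example.com/pricing", date(2006, 11, 15)),
    ("http://www.example.com/specials", date(2006, 11, 10)),
]

def build_sitemap(pages):
    """Render a Sitemaps 0.90 <urlset> document for the given pages."""
    entries = "".join(
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        f"    <lastmod>{modified.isoformat()}</lastmod>\n"
        "  </url>\n"
        for url, modified in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}"
        "</urlset>\n"
    )

print(build_sitemap(pages))
```

A production version would also XML-escape special characters in URLs (e.g. `&` as `&amp;`) and respect the protocol's limits of 50,000 URLs and 10 MB per file, splitting larger sites across multiple Sitemaps listed in a Sitemap index file.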
For companies looking to improve user experience while keeping costs low, Sitemaps also helps make more efficient use of bandwidth. Sitemaps can help search engines find a company’s newest content more efficiently and avoid the need to revisit unchanged pages. Sitemaps can list what is new on a site and quickly guide crawlers to that new content.
The protocol will be available at sitemaps.org, and the companies plan to have Yahoo Small Business host the site. Any site owner can create and upload an XML Sitemap and submit the URL of the file to participating search engines.
Announcements from Google, Yahoo and Microsoft:
Ken Moss, General Manager, Live Search:
So, why are we excited to work on this? Because by agreeing on a standard, we can provide site owners with one simple way to share information with every search engine. You just publish a sitemap, and every engine is instantly able to read and use the data to more effectively index your site. Since this is a free, widely supported protocol, our hope is that this will foster an even broader community of developers building support for it.
We are 100% behind this protocol – this kind of collaboration will help improve the search experience for all of our customers, and we are working hard to release full support in 2007. We are starting to alpha test with internal partners such as MSDN and Microsoft Support now. Like all teams at Microsoft, we like to dogfood our work internally to ensure that it is working properly before it is publicly released. Watch this space for an update as soon as we’re done.
Priyank Garg, Yahoo! Search Product Manager:
By offering an open standard for web sites, webmasters can use a single format to create a catalog of their site URLs and to notify the major search engines of changes. This should make it easier for web sites to provide search engines with content and metadata. And in turn, search engines can spend less time crawling unchanged pages and can update indexes faster as new content is discovered. This will help us reflect the changes more quickly, and improve our ability to provide more timely and relevant search results for users. Sitemaps is available to any site owner who wishes to communicate more easily with participating search engines. Simply create and upload an XML Sitemap and submit the URL of the file to search engines.
You can submit Sitemaps to Yahoo! Search through Site Explorer, just as you have been able to add RSS feeds until now. Simply add the site the feed belongs to into your list of sites, and then add the feed for that site. We will retrieve the Sitemap and use the data you provide us.
Google:
As part of this development, we’re moving the protocol to a new namespace, www.sitemaps.org, and raising the version number to 0.9. The sponsoring companies will continue to collaborate on the protocol and publish enhancements on the jointly-maintained site sitemaps.org.
If you’ve already submitted a Sitemap to Google using the previous namespace and version number, we’ll continue to accept it. If you haven’t submitted a Sitemap before, check out the documentation on www.sitemaps.org for information on creating one. You can submit your Sitemap file to Google using Google webmaster tools. See the documentation that Yahoo! and Microsoft provide for information about submitting to them.
If any website owners, tool writers, or web server developers haven't gotten around to implementing Sitemaps yet, thinking this was just a crazy Google experiment, we hope this joint announcement shows that the industry is heading in this direction. The more of the web Sitemaps eventually cover, the more they can change the way web crawlers interact with websites. In our view, the experiment is still underway.