In an announcement at PubCon in Las Vegas, Google, Microsoft and Yahoo have agreed to come together and adopt a standard protocol for submitting web pages to their crawlers via sitemaps.

Google was the first to develop a sitemaps program, one where webmasters could submit a feed to the Google index to not only ensure their pages are crawled but also identify any potential problems. Now MSN and Yahoo are following suit. A new site (Sitemaps.org) has been launched that will contain more information on the subject.
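For anyone who has not seen one of these feeds, the format itself is just a small XML file listing URLs. Below is a rough sketch of how one might generate a minimal Sitemaps-protocol file with Python’s standard library; the example.com URLs and dates are placeholders, not real data.

```python
# Rough sketch of a minimal Sitemaps-protocol feed, built with Python's
# standard library. The example.com URLs and dates are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # emit the protocol's default namespace

urlset = ET.Element("{%s}urlset" % NS)

# One <url> entry per page; <loc> is required, the other tags are optional hints.
for loc, lastmod in [
    ("http://www.example.com/", "2006-11-16"),
    ("http://www.example.com/products.html", "2006-11-10"),
]:
    url = ET.SubElement(urlset, "{%s}url" % NS)
    ET.SubElement(url, "{%s}loc" % NS).text = loc
    ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
    ET.SubElement(url, "{%s}changefreq" % NS).text = "weekly"
    ET.SubElement(url, "{%s}priority" % NS).text = "0.8"

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The resulting sitemap.xml is what gets submitted to (or fetched by) each engine once it is published on the site.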

From the Official Yahoo Blog: “By offering an open standard for web sites, webmasters can use a single format to create a catalog of their site URLs and to notify changes to the major search engines. This should make it easier for web sites to provide search engines with content and metadata.”

From the Official MSN Blog: “So, why are we excited to work on this? Because by agreeing on a standard, we can provide site owners with one simple way to share information with every search engine. You just publish a sitemap, and every engine is instantly able to read and use the data to more effectively index your site. Since this is a free, widely supported protocol, our hope is that this will foster an even broader community of developers building support for it.”

And then a comment from the Official Google Blog: “If any website owners, tool writers, or webserver developers haven’t gotten around to implementing Sitemaps yet, thinking this was just a crazy Google experiment, we hope this joint announcement shows that the industry is heading in this direction.”

Danny Sullivan, who has been pushing search engines to accept common standards for some time now, had the following to say: “Overall, I’m thrilled. It took nearly a decade for the search engines to go from unifying around standards for blocking spidering and making page descriptions to agreeing on the nofollow attribute for links in January 2005. A wait of nearly two years for the next unified move is a long time, but far less than 10 and progress that’s very welcomed. I applaud the three search engines for all coming together and look forward to more to come.”

Most webmasters will never have any need to use the Sitemaps protocol, mostly because their sites are small and easy to crawl. However, many large e-commerce and informational sites have already been using Google Sitemaps, not only to ensure all of their hard-to-find content is indexed but also to identify potential problems with pages. Now they will be able to do the same with MSN and Yahoo.
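For large sites like these, the protocol also allows URLs to be split across several sitemap files (each capped at 50,000 URLs) that are tied together by a sitemap index. Here is a brief sketch of generating such an index in Python; the example.com filenames are again placeholders.

```python
# Sketch of a Sitemaps-protocol index file, which large sites use to point
# crawlers at several individual sitemaps. The example.com names are made up.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

index = ET.Element("{%s}sitemapindex" % NS)

# One <sitemap> entry per sitemap file on the site.
for part in ("sitemap-products.xml", "sitemap-articles.xml"):
    entry = ET.SubElement(index, "{%s}sitemap" % NS)
    ET.SubElement(entry, "{%s}loc" % NS).text = "http://www.example.com/" + part

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```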

Other search engines are invited to adopt the protocol, which leads me to wonder when Ask will step up to the plate.

Although I love the Ask search engine, it is more difficult there to get all of your content indexed, as well as to make sure that what they do have indexed is fresh and up to date. That may have something to do with the fact that their intention is not to have the biggest index but rather the most reliable search results (at least it was stated as such at SES 2005 in San Jose). Still, it would be nice if sites that are currently indexed could ensure that their content is fresh and current rather than stale and out of date.

What’s next in cooperation? All the engines have recently adopted a standard for a “nofollow attribute” for links and then a standard for opting out of having Open Directory information appear in the organic search results. Perhaps the next project will be adopting a common practice for the robots.txt system of blocking pages? We’ll just have to wait and see.
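As a point of reference, the blocking side of robots.txt is already machine-readable today. The following sketch uses Python’s standard urllib.robotparser to show how a crawler interprets a set of hypothetical example.com rules; the bot names and paths are illustrative only.

```python
# Sketch of how a crawler reads robots.txt blocking rules, using Python's
# standard urllib.robotparser. The rules, paths and bot names are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /checkout/

User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A generic crawler is blocked from /checkout/, but the Googlebot group is not.
print(parser.can_fetch("SomeBot", "http://www.example.com/checkout/thanks.html"))   # False
print(parser.can_fetch("Googlebot", "http://www.example.com/checkout/thanks.html"))  # True
print(parser.can_fetch("Googlebot", "http://www.example.com/private/data.html"))     # False
```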

David Wallace

David Wallace, co-founder and CEO of SearchRank, is a recognized expert in the industry of search and social media marketing. Since 1997, David has been involved in developing successful search engine and social media marketing campaigns for large and small businesses.