Just before leaving work on Friday I noticed the changes Google has made to its Sitemaps tool. You can now see the number of pages Googlebot has crawled from your site per day, the number of kilobytes of data it has downloaded per day, and the average time it took to download a page (the data covers the last 90 days).
Sitemaps is a very handy diagnostic tool, as you can see how a site is doing – as seen by Google. Having just launched a new site, it's nice to see that Google is coming around and spidering it.
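For anyone who hasn't submitted one yet, a sitemap is just an XML file listing your URLs. A minimal sketch (the URL and dates are placeholders – swap in your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element -->
    <loc>http://www.example.com/</loc>
    <!-- the rest are optional hints to the crawler -->
    <lastmod>2006-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You then tell Google where the file lives via the Sitemaps interface, and it starts showing you the stats above.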
You can also adjust the rate at which a site is spidered – which could be very useful.
OK, so Microsoft and Yahoo – can we have something similar? 🙂