This morning I was getting my day started by looking in Google Webmaster Central at the stats for animalhavenshelter.org - the web site for an animal shelter here in New York. Animal Haven is an organization we're trying to help, though they're losing the employee who was acting as their webmaster, so things are slow going at the moment...
What I noticed was a note under 'Crawl Rate' that said:
We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below.
As you might expect from the note, the 'faster' option was available - it lets you tell Google that they can crawl your site faster than they normally would. I've seen the 'faster' option enabled on brand-new sites with no history in Google Webmaster Tools, but it's been a long time since I've seen it on a site that isn't newly registered with GWT.
Thing is, I can't see any reason why this site would have the 'faster' option enabled. Google hasn't been crawling that fast to start with (an average of 13 pages a day, with a maximum of 66 pages in a day).
And the 'faster' option wasn't offered because of server performance - the average time to download a page is less than 1 second.
It does seem that googlebot recently downloaded a large file - but everything they downloaded that day totaled only 6.8 MB, which isn't that much...
So it's a pretty big mystery why Animal Haven gets the 'faster' option, while much bigger and busier websites don't get 'faster' as an option.
By comparison, netterimages.com (a busy medical illustration site) does not have the 'faster' option enabled, and googlebot crawls 6,287 of its pages per day on average.
And netterimages did have a performance problem in the past (since solved) - exactly the kind of thing I would expect to limit googlebot's crawl and perhaps trigger the 'faster' option.
If a site like Animal Haven gets the 'faster' option, but a site like Netter doesn't, the only explanation I can think of is that one of the "many factors" googlebot uses to determine a site's crawl rate is whether the site is a non-profit. It could be that they use the resources of non-profits more gently than they use the resources of commercial sites.
Needless to say, I opted for the 'faster' option and will continue to keep an eye on it...