In a nutshell, it's the content that does it. The size and quality of the content have passed the threshold where Google will spider the site as fast as the site allows. SO has to actively throttle Googlebot; Jeff mentioned on Coding Horror that they were getting over 50,000 requests a day from Google, and that was over a year ago.
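As an aside, the usual site-level lever for slowing crawlers down is robots.txt. Here is a minimal, hypothetical sketch (not SO's actual file); note that the non-standard Crawl-delay directive is honored by crawlers like Bing and Yandex, but Googlebot ignores it, so Google's crawl rate has to be tuned through its webmaster tools instead:

```
# Hypothetical robots.txt illustrating per-crawler throttling.
# Crawl-delay is non-standard; Bing and Yandex honor it,
# but Googlebot does not.

User-agent: bingbot
Crawl-delay: 10       # ask for roughly 10 seconds between requests

User-agent: *
Disallow:             # everyone else may crawl the whole site
```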
If you browse the non-news sites in the Alexa top 500, you will find that almost all of them have Google results only a few minutes old. (For example, search for site:archive.org on Google and select "Latest" from the left-hand menu.)
So there is nothing practical you can do on your own site to speed up spidering, except to increase the amount of traffic to your site ...
Colin Pickard