I have a more or less static site with more than 2 million unique links and pages (basically a directory of companies).
Googlebot visits my server daily to reread the sitemaps (which I've gzipped to keep the load down). It reads them one after another, one every 10 seconds (1 sitemap index, 2,000 sitemaps).
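For reference, generating one of those gzipped sitemap files is straightforward. This is a minimal sketch, assuming placeholder URLs and a hypothetical filename, not the site's actual data:

```python
import gzip

# Hypothetical URLs standing in for the real directory entries
urls = [
    "https://example.com/company/1",
    "https://example.com/company/2",
]

# Build a minimal sitemap document per the sitemaps.org protocol
body = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "".join(f"  <url><loc>{u}</loc></url>\n" for u in urls)
    + "</urlset>\n"
)

# Write it gzip-compressed; Googlebot accepts .xml.gz sitemaps
with gzip.open("sitemap-0001.xml.gz", "wt", encoding="utf-8") as f:
    f.write(body)
```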
However, it crawls the pages themselves at a much slower pace (one every 3 minutes).
Google's tools show me that the sitemaps are read correctly, indicating it has discovered 2 million+ URLs.
At this pace it will take more than 10 years before the site is fully indexed!
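The back-of-the-envelope math, using the numbers above, bears this out:

```python
urls = 2_000_000       # URLs discovered via the sitemaps
minutes_per_url = 3    # observed crawl pace: one page every 3 minutes

total_minutes = urls * minutes_per_url
years = total_minutes / (60 * 24 * 365)
print(f"{years:.1f} years")  # → 11.4 years
```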
What can I do to speed up crawling?