Unable to complete crawling sessions since 6-23-11
« on: July 09, 2011, 11:49:56 PM »
We've got about 125,000 pages on our website, with the majority sitting at a link depth of 6. The last time I was able to get the process to complete was on June 23rd. Since then, the session stalls every time after 86,940 pages have been added to the sitemap. See the attached screenshots showing where it hangs.

Any idea what is going wrong? I really want to get this completing again so I can resubmit to Google all the changes we've made recently!

Here are the sitemap details:
Request date:
23 June 2011, 20:32
Processing time:
3:37:38s
Pages indexed:
95347
Sitemap files:
5
Pages size:
1,100.73 MB

Some configuration settings:
25,000 URLs per file, 10 MB per file

Number of links per page and sort order in HTML sitemap: 1000

Re: Unable to complete crawling sessions since 6-23-11
« Reply #1 on: July 10, 2011, 02:16:36 AM »
OK, finally solved it!

I changed the configuration settings to 512 MB of memory and an unlimited timeout, and that seemed to do it.
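
For anyone else who hits the same wall: as far as I understand it, the generator runs as a PHP script on your own server, so the "memory" and "timeout" options it exposes correspond to PHP's memory_limit and max_execution_time directives. Here's a rough sketch of the equivalent raw PHP settings, just to illustrate what the configuration change is doing under the hood (the actual option names in the generator's config screen may differ):

<?php
// Illustrative only: these are the standard PHP directives behind
// a "512 MB memory / unlimited timeout" style configuration.
ini_set('memory_limit', '512M'); // enough headroom to hold ~125,000 crawled URLs in memory
set_time_limit(0);               // 0 removes the maximum script execution time

Changing the values through the generator's own configuration page (as I did) is the simpler route if your host allows it; editing php.ini directly should have the same effect.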