I have roughly 100K pages, but the sitemap crawler always stops after only a few hundred pages, 500 or 1,000 at most. I constantly have to stop it and restart from the prior save.
Also, sometimes during this process the counter resets even though the crawldump.log file appears to still be updated (I can see its size growing). When that happens, I have to manually roll back to a prior version of the save (which I have now learned to do, after losing weeks of work).
What could I be doing wrong? I would like to automate this process so that I am not manually resetting after every 500 crawls.
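For context, what I am hoping for is roughly the wrapper below: a rough sketch, not my actual setup. The `crawl_batch()` function is a hypothetical stand-in for whatever the crawler really does, and the checkpoint is written atomically (temp file plus rename) so that a crash mid-write can never corrupt the last good save, which I suspect is what keeps resetting my counter.

```python
import json
import os
import tempfile

def save_checkpoint(path, state):
    # Write atomically: dump to a temp file in the same directory, then
    # rename over the old checkpoint. A crash mid-write therefore never
    # leaves a half-written (corrupt) save behind.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)

def load_checkpoint(path):
    # Resume from the last good save, or start fresh if none exists.
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {"done": 0}

def run(crawl_batch, total, path="checkpoint.json", batch=500, max_retries=5):
    # crawl_batch(start, n) is a hypothetical hook for the real crawler.
    # On any failure we reload the last checkpoint and retry, instead of
    # me stopping and restarting the whole thing by hand.
    state = load_checkpoint(path)
    failures = 0
    while state["done"] < total:
        try:
            n = min(batch, total - state["done"])
            crawl_batch(state["done"], n)
            state["done"] += n
            save_checkpoint(path, state)
            failures = 0
        except Exception:
            failures += 1
            if failures >= max_retries:
                raise  # give up only after repeated consecutive failures
            state = load_checkpoint(path)  # roll back to last good save
    return state["done"]
```

Is something along these lines the right approach, or is there a standard way crawlers handle resume/checkpointing that I am missing?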