Hi - I can see some similar scenarios, but not this exact one. It's a large site (80,000 locations, excluding images, etc.). I have gradually cranked up the memory limit, and I'm now running the crawler via the command line. The server was rebooted, probably early on the second day of crawling, and now the crawl won't resume.
When I restart it, it emits the page header, down to the <body> tag, and then returns to the command line. To save memory, I have disabled the HTML sitemap, ROR, etc. (following the install guide recommendations). I switched to the second memory type (var export) *before* I started the run. The save script setting is at 180 (the default).
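For reference, this is roughly how I'm invoking it from the command line. The script name and the memory value here are placeholders to show the shape of the command, not my exact settings:

```shell
# Hypothetical invocation - script path and limits are examples only.
# -d overrides php.ini settings for this run: a raised memory limit
# and no execution timeout, so a long crawl isn't killed mid-run.
php -d memory_limit=1024M -d max_execution_time=0 runcrawl.php
```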
There are files in the data directory, including a non-empty crawl_dump.log, a zero-length interrupt.log, and a non-zero-length placeholder.txt.
Suggestions for how to get this to resume and proceed further would be most welcome. I'd rather avoid another 24-hour delay, if possible.
Thanks in advance!