Hi,
Is there a way to resume a previous run of generating the sitemap.xml?
The software works well on most of my sites. I had some problems with safe_mode and the memory limit (I had to increase it to 64MB in PHP), but overall it runs fine.
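
For anyone hitting the same limit, this is roughly what I did. Whether ini_set() takes effect depends on the hosting setup; under safe_mode it may be ignored, in which case php.ini is the place to change it:

<?php
// Try to raise the PHP memory limit at runtime. Under safe_mode or
// restrictive hosting this call may be silently ignored; if so, set
// the value in php.ini (or .htaccess) instead.
ini_set('memory_limit', '64M');

// php.ini equivalent:
//   memory_limit = 64M
// .htaccess equivalent (Apache with mod_php):
//   php_value memory_limit 64M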
I have a site of about 40 to 50 million pages. I haven't hit an error yet, but I'm guessing I might (the job is still running in an SSH window). My question: is there a way to resume the crawl if at some point the script locks up or stops? It would be good to have such a feature.
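
To illustrate the kind of thing I mean, here is a minimal sketch of checkpoint-and-resume for a crawler. The file name, start URL, and structure are just my own illustration, not anything from the actual tool:

<?php
// Illustrative checkpoint/resume sketch -- NOT the tool's actual code.
// Idea: periodically save the crawl queue and visited set to disk so
// that a restarted run can continue instead of starting over.

$checkpoint = 'crawl_state.dat'; // hypothetical state file

function save_state($file, $queue, $visited) {
    // Write to a temp file first, then rename, so a crash mid-write
    // never corrupts the previous checkpoint.
    file_put_contents($file . '.tmp', serialize(array($queue, $visited)));
    rename($file . '.tmp', $file);
}

// Resume from a previous checkpoint if one exists, else start fresh.
if (file_exists($checkpoint)) {
    list($queue, $visited) = unserialize(file_get_contents($checkpoint));
} else {
    $queue   = array('http://example.com/'); // hypothetical start URL
    $visited = array();
}

$count = 0;
while (!empty($queue)) {
    $url = array_shift($queue);
    if (isset($visited[$url])) {
        continue; // already crawled before the restart
    }
    $visited[$url] = true;
    // ... fetch $url, extract links, append unseen ones to $queue ...
    if (++$count % 1000 == 0) {
        save_state($checkpoint, $queue, $visited); // checkpoint every 1000 pages
    }
}
save_state($checkpoint, $queue, $visited);

Of course, at 40-50 million pages the visited set would have to live in a database or on-disk index rather than an in-memory array, but the save-and-resume idea is the same.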
Besides that, good work.