Re: Out of memory problems
« Reply #15 on: June 04, 2012, 09:19:53 PM »
OK - I can see the process, but I can't stop it.

It tells me that the stop signal has been sent to the crawler, but the crawling tab indicates that crawling is still in progress. And uploading interrupt.log no longer seems to work.

So I've also no idea whether it will stop when the crawl finishes, or whether it will set off another crawl automatically (just to be clear, I am happy for it to resume an interrupted crawl, but I only want it to start crawling on demand).

Thanks, Mike