Oleg, please help me figure this out...
Every time I run a crawl, it stops after a while. I have the configuration set to save the script state every 30 seconds, which lets me manually 'Resume' from the point where it stopped. That's not such a big problem when I'm running the crawl manually. However, the crawl also stops when my scheduled weekly cron job runs it, and that's where it becomes a bigger problem.
I've read many posts from people saying that a crawl can take many hours, but that's not my issue... it is definitely stopping: when it stops, I lose the "Transferring data from..." message (lower left corner of the Firefox browser) and the status bar (lower right corner of the browser).
I also get this error every time the cron job runs (GoDaddy emails it to me):
/web/cgi-bin/php5: Symbol `client_errors' has different size in shared object, consider re-linking
Set-Cookie: PHPSESSID=77fpbbomtl4iegge4elaknf5h7; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
What can I do about this error? What does it mean?
Thanks so much!