Crawler Hangs
« on: December 18, 2013, 10:45:37 PM »
Crawler hangs and eventually times out. I have increased the memory_limit setting in php.ini to 256M, but it still hangs, as seen below:

Links depth: 4
 Current page: results.php?county=Hardin+County&r_id=30819&r_type=Bankruptcy+%2F+Bankruptcies&state=Illinois&search=
 Pages added to sitemap: 9334
 Pages scanned: 9832 (2,505,291.7 KB)
 Pages left: 47730 (+ 75 queued for the next depth level)
 Time passed: 1:29:51
 Time left: 7:16:12
 Memory usage: 58,604.4 Kb

Any ideas?
Re: Crawler Hangs
« Reply #1 on: December 19, 2013, 01:58:08 PM »
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration on your host (php.ini file), or contact your hosting support about this.
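
For example, the relevant lines in php.ini might look like the following (the values are illustrative, the php.ini location varies by host, and some shared hosts ignore local overrides):

 memory_limit = 512M            ; maximum memory a PHP script may allocate
 max_execution_time = 3600     ; maximum run time in seconds for web requests

After changing php.ini you may need to restart the web server (or PHP-FPM) for the new values to take effect.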
 
Re: Crawler Hangs
« Reply #2 on: December 19, 2013, 02:42:39 PM »
I increased the memory_limit and max_execution_time settings in the PHP configuration (max_execution_time from 30 to 1000). The crawler still hangs after more than 25,000 lines.
Re: Crawler Hangs
« Reply #3 on: December 23, 2013, 01:50:21 PM »
Hello,

in this case I would recommend running the generator from the command line, if you have SSH access to your server. PHP's execution time limit doesn't apply in CLI mode (max_execution_time defaults to 0 there), so a long crawl can run to completion.
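
Something along these lines, assuming the generator is installed under /path/to/generator and its command-line entry point is runcrawl.php (both are assumptions, check the files in your installation):

 cd /path/to/generator                        # assumed install directory
 php runcrawl.php                             # assumed CLI entry point; no execution time limit in CLI mode
 nohup php runcrawl.php > crawl.log 2>&1 &    # optional: keep the crawl running after you disconnect

The nohup variant detaches the crawl from your SSH session and writes the progress output to crawl.log, so the crawl can finish even if the connection drops.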