Crawling is stopping after a while
« on: August 05, 2014, 01:49:58 PM »
Dear XML Team,

I have a problem with the script - I installed it correctly and started the crawling, but every time it stops. I have now waited a long time, but nothing happens...

Already in progress. Current process state is displayed:
Links depth: 3
Current page: XYZhidden
Pages added to sitemap: 2300
Pages scanned: 2341 (87,000.1 KB)
Pages left: 1949 (+ 683 queued for the next depth level)
Time passed: 0:06:21
Time left: 0:05:17
Memory usage: 3,982.1 Kb
Re: Crawling is stopping after a while
« Reply #1 on: August 05, 2014, 02:25:45 PM »
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration on your host (php.ini file), or contact your hosting support about this.
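For reference, the relevant php.ini entries might look like the fragment below. The values are only illustrative assumptions - choose limits that fit your site size and what your hosting plan allows:

```ini
; Illustrative values only - adjust for your site size and server resources.

; Maximum time (in seconds) a script is allowed to run.
max_execution_time = 600

; Maximum amount of memory a single script may consume.
memory_limit = 512M
```

After editing php.ini, restart the web server (or PHP-FPM) for the change to take effect. On shared hosting you often can't edit the global php.ini; depending on the host, the same settings may be adjustable via a local php.ini or .htaccess instead.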
 
Re: Crawling is stopping after a while
« Reply #2 on: August 06, 2014, 07:01:53 AM »
Hello,

thanks for the advice. I spoke to the server administrator - he is asking for a log file where we can see exactly where the problem is.
Is there a log file somewhere?
Re: Crawling is stopping after a while
« Reply #3 on: August 06, 2014, 04:53:56 PM »
Hello,

in case the issue is related to memory/execution time limits, there would be an entry in the web server's error log (not in the generator script's log).
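As a sketch of what to look for: the error log location depends on the setup (common defaults, not guaranteed, are /var/log/apache2/error.log on Debian/Ubuntu Apache, /var/log/httpd/error_log on RHEL/CentOS, or /var/log/nginx/error.log). The example below uses a synthetic sample log, with a hypothetical script path, purely to show the two PHP fatal-error messages these limits typically produce and how to filter for them:

```shell
# Synthetic sample log - real entries would be in your web server's error log.
# The script path and timestamps below are made up for illustration.
cat > /tmp/sample_error.log <<'EOF'
[Wed Aug 06 16:40:02 2014] PHP Fatal error:  Allowed memory size of 268435456 bytes exhausted (tried to allocate 1048576 bytes) in /path/to/generator/index.php on line 2
[Wed Aug 06 16:41:15 2014] PHP Fatal error:  Maximum execution time of 60 seconds exceeded in /path/to/generator/index.php on line 2
EOF

# Filter the log for the two limit-related fatal errors:
grep -E 'Allowed memory size|Maximum execution time' /tmp/sample_error.log
```

If neither message appears in the real log, the crawl may be stopping for some other reason (e.g. the browser session timing out), which is worth mentioning to the administrator.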
Re: Crawling is stopping after a while
« Reply #4 on: August 08, 2014, 03:46:34 AM »
I have a large site (over 700,000 pages) and I'm only able to generate a sitemap.xml of about 558 pages, even after trying a couple of times. I have a dedicated server. Why does the crawler stop at the same point and not continue?

My max_execution_time = 60
memory_limit = 256M

« Last Edit: August 08, 2014, 03:50:51 AM by mark31 »
Re: Crawling is stopping after a while
« Reply #5 on: August 08, 2014, 06:14:41 PM »
Hello,

please send me your generator URL/login in a private message so I can check this.