• Welcome to Sitemap Generator Forum.
 

Crawling is stopping after a while

Started by paypal1360, August 05, 2014, 01:49:58 PM


paypal1360

Dear XML Team,

I have a problem with the script. I installed it correctly and started the crawling, but every time it stops partway through. I have now waited a long time, but nothing happens...

Already in progress. Current process state is displayed:
Links depth: 3
Current page: XYZhidden
Pages added to sitemap: 2300
Pages scanned: 2341 (87,000.1 KB)
Pages left: 1949 (+ 683 queued for the next depth level)
Time passed: 0:06:21
Time left: 0:05:17
Memory usage: 3,982.1 Kb

XML-Sitemaps Support

Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration on your host (the php.ini file), or contact your hosting support about this.
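For reference, the two directives mentioned above look like this in php.ini. The values shown are examples only, not recommendations from the generator's documentation; pick limits appropriate to your site size, and note that on shared hosting these settings may be locked by the host:

```ini
; php.ini -- example values only; tune to your site and hosting plan
memory_limit = 256M        ; maximum memory a single PHP script may allocate
max_execution_time = 300   ; maximum script run time in seconds (0 disables the limit)
```

After changing php.ini, the web server (or PHP-FPM) usually has to be restarted for the new values to take effect.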
 

paypal1360

Hello,

thanks for the advice. I spoke to the server administrator, and he is asking for a log file where we can see exactly where the problem is.
Is there a log file somewhere?

XML-Sitemaps Support

Hello,

if the issue is related to the memory or execution-time limits, there would be an entry in the web server's error log (not in the generator script's log).
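When one of those limits is hit, PHP logs a standard fatal error, which is what to search the server's error log for. A minimal sketch, assuming a Unix host; the sample log lines and the `/tmp` path are fabricated for illustration, and the real log location varies by server (e.g. `/var/log/apache2/error.log` on Debian/Ubuntu Apache):

```shell
# Illustrative sample of the two PHP fatal errors produced when
# memory_limit or max_execution_time is exceeded (fabricated lines):
cat > /tmp/php_error_sample.log <<'EOF'
PHP Fatal error:  Allowed memory size of 268435456 bytes exhausted (tried to allocate 65536 bytes) in /path/to/script.php on line 123
PHP Fatal error:  Maximum execution time of 60 seconds exceeded in /path/to/script.php on line 456
EOF

# Search a log for either message -- point this at your real error log:
grep -E "Allowed memory size|Maximum execution time" /tmp/php_error_sample.log
```

If either message appears around the time the crawl stopped, it confirms which limit is being hit.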

mark31

I have a large site (over 700,000 pages), and after trying a couple of times I'm only able to generate a sitemap.xml of about 558 pages. I have a dedicated server. Why does the crawler stop at the same point and not continue?

My max_execution_time = 60
memory_limit = 256M