Crawling fails if more than 140 pages

Started by mrogers, June 30, 2006, 12:28:00 PM


mrogers

Hello,

I've just purchased this program and installed it. It all seems to work fine if I limit the number of pages to 140.

If I set the limit any higher than 140, I get a "page not found" screen after the program has been crawling for a few seconds, and the sitemap is not generated.

I have tried both the background and foreground methods, as well as save and resume, but I cannot map more than 140 pages.

Using the online method, the program stops when it hits the maximum of 500 pages, as it should.

Can you help me, please?

XML-Sitemaps Support

Hello,

Most likely, your server configuration limits the maximum execution time for scripts, and that is not enough time to complete sitemap generation.
You should increase the max_execution_time setting in your php.ini file, or run the crawler from the SSH command line (CLI PHP has no execution time limit by default).
More details here: https://www.xml-sitemaps.com/forum/index.php/topic,124.0.html
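As a rough sketch of those two options (the exact script name and paths depend on your installation of the generator, so treat these as examples to adapt, not exact commands):

```shell
# Option 1: raise the limit in php.ini (find the line and change the value).
# 0 means "no limit"; pick a value large enough for your site's size.
#   max_execution_time = 600

# Option 2: run the crawler over SSH instead of through the browser.
# CLI PHP defaults to no execution time limit, so long crawls can finish.
# "runcrawl.php" and the install path below are illustrative placeholders.
cd /path/to/generator
php runcrawl.php
```

If you cannot edit php.ini on shared hosting, your host's control panel may offer a per-account PHP settings override instead.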