Crawling fails if more than 140 pages
« on: June 30, 2006, 12:28:00 PM »

I've just purchased this program and installed it. It all seems to work fine if I limit the number of pages to 140.

If I set any number higher than 140, I get a "page not found" screen after the program has been crawling for a few seconds, and the sitemap is not generated.

I have tried both the background and foreground methods, as well as save-and-resume, but I cannot map more than 140 pages.

Using the online method, the program stops when it hits the maximum of 500 pages, as it should.

Can you help me, please?
Re: Crawling fails if more than 140 pages
« Reply #1 on: July 01, 2006, 01:28:14 AM »

Most likely, your server configuration limits the maximum execution time for PHP scripts, and that is not enough time to complete sitemap generation.
You should increase the max_execution_time setting in your php.ini file, or execute the crawler from an SSH command line instead.
More details here:,124.html
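As a rough sketch of the first option: raising the PHP time limit in php.ini might look like the fragment below (the value of 600 seconds is just an illustrative choice, not a recommendation from this thread; your host may also override this setting elsewhere, e.g. in a per-directory configuration).

```ini
; php.ini — allow long-running scripts
; 0 would mean no limit at all; 600 allows ten minutes per request
max_execution_time = 600
```

After changing php.ini, the web server usually needs to be restarted (or PHP-FPM reloaded) for the new value to take effect. Running the crawler from the SSH command line sidesteps the issue entirely, since the PHP CLI defaults to no execution time limit.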