XML Sitemaps Generator

Author Topic: Crawling fails if more than 140 pages  (Read 25128 times)

mrogers

  • Registered Customer
  • Approved member
  • *
  • Posts: 2
Crawling fails if more than 140 pages
« on: June 30, 2006, 11:28:00 AM »
Hello,

I've just purchased this program and installed it. Everything seems to work fine as long as I limit the crawl to 140 pages.

If I set the limit to anything higher than 140, I get a "page not found" screen after the program has been crawling for a few seconds, and the sitemap is not generated.

I have tried both the background and foreground methods, as well as save and resume, but I cannot map more than 140 pages.

Using the online method, the program stops when it hits the maximum of 500 pages, as it should.

Can you help me, please?

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10621
Re: Crawling fails if more than 140 pages
« Reply #1 on: July 01, 2006, 12:28:14 AM »
Hello,

Most likely this means that your server configuration limits the maximum execution time for scripts, and that limit is too short for the sitemap generation to complete.
You should increase the max_execution_time setting in your php.ini file, or run the crawler from the SSH command line.
More details here: http://www.xml-sitemaps.com/forum/index.php/topic,124.html
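As a rough sketch of the two fixes mentioned above (the php.ini path and the crawler script name vary by installation, so treat both as placeholders):

```shell
# Option 1: raise the limit in php.ini (default is often 30 seconds),
# then restart your web server so the change takes effect:
#
#   max_execution_time = 300
#
# Option 2: run the crawler over SSH instead. You can also override the
# limit just for that run with PHP's -d flag (script path is illustrative):
php -d max_execution_time=300 /path/to/generator/runcrawl.php
```

Note that PHP's command-line interface typically ships with max_execution_time set to 0 (unlimited), which is why running the crawler from SSH usually avoids the timeout even without changing php.ini.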
Oleg Ignatiuk
www.xml-sitemaps.com

