XML Sitemaps Generator

Author Topic: sitemap stops after crawling a few pages  (Read 28578 times)

kairekai

  • Registered Customer
  • Approved member
  • *
  • Posts: 1
sitemap stops after crawling a few pages
« on: June 26, 2006, 11:22:52 PM »
I have 100K pages or so, but the sitemap crawler always stops after crawling a few pages - 500 or 1000 at most. I am always having to stop and re-start from the prior save.

Also, sometimes during this process the counter resets even though the crawldump.log file appears to be updating (I can see its size growing). In those cases I have to manually roll back to a prior version of the save (which I have now learned to do, after losing weeks).

What could I be doing wrong? I would like this process automated so that I am not having to manually reset after every 500 crawls.

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: sitemap stops after crawling a few pages
« Reply #1 on: June 27, 2006, 10:10:47 PM »
Hello,

This issue means that your server limits the maximum execution time for PHP scripts.
You can extend this limit by modifying your host configuration: in the php.ini file, increase the max_execution_time setting and restart Apache afterwards.
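For example (a minimal sketch, assuming a Linux host with php.ini at /etc/php/php.ini and Apache restartable via apachectl - the file path, the 3600-second value, and the restart command are assumptions, so adjust them for your server):

    ; /etc/php/php.ini - allow long-running crawler scripts
    ; max_execution_time is the per-request limit in seconds (defaults are often 30-60)
    max_execution_time = 3600

    # then restart Apache so the new limit takes effect, e.g.:
    sudo apachectl restart

If you cannot edit php.ini, your host may still allow raising the limit in another way (for example via a per-directory override), so check with your hosting provider.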
Oleg Ignatiuk
www.xml-sitemaps.com

For maximum exposure and traffic for your web site check out our additional SEO Services.

 
