Crawler gets stuck?

Started by jon, November 20, 2005, 06:20:44 AM


jon

I'm not sure if this is normal, but I've tried to run this twice now, and each time the crawler seems to get stuck at around 5,000 pages. I've checked back on it several times over the last hour and it's still at the same number of entries. Is there any way to tell if it is stuck, and if so, what should I do?

Thanks!


jon

I did that.  Any other suggestions?

XML-Sitemaps Support

If it is still interrupted, you can use the "Save the script state, every X seconds" option to resume crawling after it stops. In this case, after the script is interrupted, you will see a "Resume scan" checkbox on the "Crawling" page.
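Conceptually, the save/resume cycle works something like the sketch below (a rough illustration only; the state file name and data layout are assumptions, not the generator's actual internals):

Code:
<?php
// Sketch of periodic state saving for a queue-based crawler.
$stateFile = 'crawl_state.dat';
$saveInterval = 60; // seconds, like the "Save the script state" option

// Resume from an earlier interrupted run if a saved state exists.
if (file_exists($stateFile)) {
    $state = unserialize(file_get_contents($stateFile));
} else {
    $state = ['queue' => ['http://www.example.com/'], 'done' => []];
}

$lastSave = time();
while (!empty($state['queue'])) {
    $url = array_shift($state['queue']);
    // ... fetch $url, extract links, push new ones onto $state['queue'] ...
    $state['done'][] = $url;

    // Persist progress so an interrupted run can pick up where it left off.
    if (time() - $lastSave >= $saveInterval) {
        file_put_contents($stateFile, serialize($state));
        $lastSave = time();
    }
}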

jon

OK, I had that set, but I had it at "3600 seconds" and wasn't getting that far. :)

I have noticed that a couple of times it got stuck on the same URL. Is there a way to manually bypass potentially troublesome URLs?

jon

I did bump this down to 600 seconds, but I'm not seeing any link to resume the crawl on the Crawling page. That's where I should see it, right?

XML-Sitemaps Support

Quote
I have noticed that a couple of times it got stuck on the same URL. Is there a way to manually bypass potentially troublesome URLs?
There are special settings on the Configuration page to exclude URLs from being parsed.
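For illustration, exclusion typically amounts to matching each discovered URL against the configured patterns before it is queued (a sketch; the pattern list, matching rule, and function name here are assumptions, not the generator's actual settings):

Code:
<?php
// Sketch of URL exclusion via simple substring matching.
$excludePatterns = ['/forum/', 'sessionid=', '/print/'];

function isExcluded(string $url, array $patterns): bool {
    foreach ($patterns as $pattern) {
        if (strpos($url, $pattern) !== false) {
            return true; // matches an exclusion pattern, so skip it
        }
    }
    return false;
}

$found = ['http://www.example.com/page.html',
          'http://www.example.com/forum/index.php'];
$queue = [];
foreach ($found as $url) {
    if (!isExcluded($url, $excludePatterns)) {
        $queue[] = $url; // only URLs that pass the check get crawled
    }
}
print_r($queue); // the forum URL has been filtered out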

Quote
I did bump this down to 600 seconds, but I'm not seeing any link to resume the crawl on the Crawling page. That's where I should see it, right?
You should try a value of 60 (or even 30) seconds for this option.

jon

I'm still having problems here.  I've tried excluding URLs but that doesn't seem to help.  Any suggestions for next steps?

XML-Sitemaps Support

You should either change your PHP configuration (increase the max_execution_time value in php.ini) or use the "Save state"/resume session option.
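For reference, the php.ini change is a one-line edit, or the limit can be raised at the top of the long-running script itself (assuming the host allows the setting to be overridden):

Code:
<?php
// In php.ini:  max_execution_time = 300   (seconds; the default is often 30)
// Or at runtime, before the crawl starts:
set_time_limit(0);                     // 0 removes the time limit entirely
ini_set('max_execution_time', '0');    // equivalent ini-based form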