Welcome to Sitemap Generator Forum.
 

Sitemap generation always stops, and the Crawling page broke

Started by ed11, February 09, 2009, 01:54:02 AM


ed11

Everything is set up fine, but the crawler always stops after 10 minutes or so.

Now my Crawling tab only displays the Run in background checkbox.

The config seems fine and it does crawl for a while. I always resume, but it doesn't run for very long.

The site I'm trying to crawl has 50-75K pages so.....

I also get this occasionally:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.


Ideas?

Ed


XML-Sitemaps Support

Hello,

It looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact your hosting support about this.
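For reference, the two directives might be raised along these lines in php.ini (the exact values below are only examples; what your host actually permits may differ):

```ini
; php.ini - example values for a long-running crawl
max_execution_time = 300   ; seconds per request (a common default is 30)
memory_limit = 256M        ; a common default is 64M or 128M
```

After editing php.ini, the web server (or PHP process manager) usually has to be restarted before the new values take effect.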

ed11

What do you recommend for the settings? My PHP max_execution_time is set to 30 and the memory_limit is set to 64M.

Thanks for your help. I will have to put in a support ticket to have this changed.

Ed


ed11

I have made the changes and it still runs for less than 5 minutes.

Also, I host at [ External links are visible to forum administrators only ] if that helps any.

Ed

XML-Sitemaps Support

Are you sure the new settings were applied? Try creating a file named phpinfo.php in the generator folder containing:
<?php phpinfo(); ?>
and open it in your browser to see your PHP settings.

ed11

Interesting. It is set in my php.ini QuickConfig but when I check with phpinfo.php, it is not set there.
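When a control-panel setting doesn't show up in phpinfo(), a per-directory override sometimes works on shared hosts. A hedged sketch, assuming PHP runs as an Apache module and the host allows these directives in .htaccess (the values are examples, not requirements):

```apacheconf
# .htaccess in the generator folder - only honored when PHP runs as mod_php;
# on CGI/FastCGI setups these lines cause a 500 error and a local php.ini
# in the script's folder may be needed instead
php_value max_execution_time 300
php_value memory_limit 256M
```

If neither override is honored, the hosting support ticket is the surest route.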

I've opened a ticket with HostGator to fix this for me.

I'll follow up when it is working.

Ed


ed11

It still stops, though it seems to run longer now. The Crawling tab doesn't fully load anymore; I attached a screenshot. At this point I can't do anything. I checked my running processes on HostGator and the crawler is not running.


vera

Oleg, I have the same problem; can you help me with it? I have followed the instructions written above, but it is still the same.

Thank you!


alan9

Hi,

Was there any resolution to this, as I'm now suffering from the same issue? My crawl will last 10 mins and then just stop - resuming does nothing.

I'm unable to change my host server settings - however, on a test site with the memory_limit and max_execution_time limits set lower than my ISP's, it completes satisfactorily!

Any ideas?