Sitemap generation hangs - shared hosting environment
« on: December 20, 2008, 05:58:00 PM »
Hi

I'm also having the timeout problem, but because I'm in a shared hosting environment I can't increase execution times in Apache and I don't have SSH access.

I run a number of sites that are updated often, and I really wanted to be able to run a cron script to generate the sitemaps. Unless I can fix this problem, I can forget about the cron.
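Something like the line below is what I had in mind for the cron (the paths and the runcrawl.php script name are just placeholders; I'd need to check the generator's documentation for its actual command-line entry point):

Code:
# Placeholder crontab entry: the paths and runcrawl.php are assumptions, not the
# generator's confirmed CLI script -- check its documentation first.
# Runs the crawl nightly at 02:30 and appends the output to a log file.
30 2 * * * /usr/bin/php /home/USERNAME/public_html/generator/runcrawl.php >> /home/USERNAME/sitemap-cron.log 2>&1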
Re: Sitemap generation hangs - shared hosting environment
« Reply #1 on: December 20, 2008, 06:10:10 PM »
Now the script will not go past a certain point, even if I interrupt and resume it:
Links depth: 2
Current page: napp/show/Brochure.htm?id=249
Pages added to sitemap: 279
Pages scanned: 280 (4,416.1 KB)
Pages left: 70 (+ 548 queued for the next depth level)
Time passed: 5:38
Time left: 1:24
Memory usage: -

 >:(
Re: Sitemap generation hangs - shared hosting environment
« Reply #3 on: December 27, 2008, 04:18:55 AM »
Oleg, I appreciate your assistance, but the number of posts that end with "Send me your URL" is a concern. Why is no solution posted?
Re: Sitemap generation hangs - shared hosting environment
« Reply #4 on: December 27, 2008, 03:03:11 PM »
In most cases the solution is to properly set up the "Exclude URLs", "Do not parse", "Memory usage" and other options in the sitemap generator configuration; the right values are usually unique to each website.
Re: Sitemap generation hangs - shared hosting environment
« Reply #5 on: December 28, 2008, 11:50:26 PM »
Thanks for that information, Oleg.

Are there any guidelines covering these issues that we can access?
Re: Sitemap generation hangs - shared hosting environment
« Reply #6 on: December 28, 2008, 11:55:09 PM »
Oleg, did you get my PM with the URL?
Re: Sitemap generation hangs - shared hosting environment
« Reply #7 on: December 29, 2008, 12:08:54 AM »
I have now installed the Generator on a third, smaller site, and it hangs after 40 pages, failing to create the sitemap.
Re: Sitemap generation hangs - shared hosting environment
« Reply #8 on: December 29, 2008, 12:18:08 AM »
Replied to your PM; it looks like the sitemap has been created successfully:
Request date: 27 December 2008, 14:11
Processing time: 1197.80s
Pages indexed: 610
Re: Sitemap generation hangs - shared hosting environment
« Reply #9 on: December 29, 2008, 10:28:39 PM »
Hi Oleg

Quote
looks like sitemap has been created successfully:
Yes, but I had to restart it several times. It would not work with a cron job.

Also, I have two other sites running the Generator. One needs the same repeated intervention as the original, and the other (a much smaller site) will not create a sitemap at all.

Cheers
Andrew
Re: Sitemap generation hangs - shared hosting environment
« Reply #10 on: December 31, 2008, 12:42:32 AM »
Hello,

It looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact your hosting support about this.
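For reference, the relevant php.ini lines would look something like this (the values below are only examples of what to ask for; whether a per-directory php.ini override is honoured depends on how PHP is set up at your host):

Code:
; Example values only -- adjust to what your host allows.
; Whether a per-directory php.ini is read depends on how PHP runs
; (CGI/FastCGI usually honours it; mod_php usually does not).
memory_limit = 256M        ; memory available to the crawler process
max_execution_time = 3600  ; maximum seconds a single run may take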
Re: Sitemap generation hangs - shared hosting environment
« Reply #11 on: January 02, 2009, 07:58:49 PM »
Hi Oleg

Thanks for the reply.

Quote
It looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact your hosting support about this.

I'll ask our host and see what happens. In the meantime, this doesn't explain why the smallest sitemap will not run past 40 pages.
Generator hangs after 40 pages
« Reply #12 on: January 05, 2009, 06:00:16 AM »
OK, I'll try this one again!

My generator hangs after crawling the same 40 pages. Hello!
Re: Sitemap generation hangs - shared hosting environment
« Reply #14 on: January 06, 2009, 01:51:25 AM »
Quote
did you try the suggestion posted above?

Even though I don't understand why I need to increase the memory and timeouts for an error that occurs after only 40 pages, I did add a php.ini file to the generator directory on all three sites and increased both the memory allocation and the timeout.

Unfortunately, this failed to fix the error, hence my reposting.
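In case it helps with diagnosis, a small test script along these lines (placed in the generator directory and opened in a browser) will show whether the per-directory php.ini is actually being read; the filename and script are just an illustration, not part of the generator:

Code:
<?php
// check_limits.php -- illustrative helper only, not part of the generator.
// Shows the limits PHP is actually using in this directory, so you can
// tell whether a per-directory php.ini override has taken effect.
$ini = php_ini_loaded_file();               // available since PHP 5.2.4
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'loaded php.ini: ' . ($ini ? $ini : 'none reported') . "\n";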