Welcome to Sitemap Generator Forum.
 

stuck at 460 pages

Started by nirav.dave, August 06, 2008, 01:31:57 PM


nirav.dave

I have downloaded v2.9 and tried it with one of my domains, and it's stuck at this point:

Links depth: 1
Current page: 78_Tiruvallur/
Pages added to sitemap: 459
Pages scanned: 460 (28,156.9 KB)
Pages left: 44 (+ 50466 queued for the next depth level)
Time passed: 2:21
Time left: 0:13
Memory usage: 13,521.1 Kb

On the same server I have v2.7 installed for another domain, which works smoothly. Any help appreciated.

cheers
dave

XML-Sitemaps Support

Hello,

perhaps you have many more pages on this new domain compared to the other one. It looks like you should significantly increase the max_execution_time and memory_limit settings in the PHP configuration on your server (php.ini) to avoid timeout issues.
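For reference, a minimal sketch of the relevant php.ini directives; the values below are illustrative only, not specific numbers recommended in this thread:

```ini
; php.ini - illustrative values for a long-running crawl
max_execution_time = 9000   ; seconds a script may run (0 = no limit)
max_input_time = 9000       ; seconds a script may spend parsing request data
memory_limit = 256M         ; maximum memory a single script may consume
```

After editing php.ini, restart the web server so the new values take effect, and verify them with phpinfo() (check the "Local Value" column, since a per-directory override can differ from the master value).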

nirav.dave

This is what I see when I check [ External links are visible to forum administrators only ]

What do you think I'm doing wrong here? The crawling now pauses after every 200-300 URLs:


max_execution_time 0
max_input_nesting_level 64
max_input_time -1
memory_limit 256M


nirav.dave

I have sent you the PM. Still haven't received any response. Help appreciated.


nirav.dave

I apologise, I didn't realise I had a PM from you.

Sorry, that did not work for me. I just tried running the generator again and it stopped at:

Links depth: 2
Current page: 12/posts/2_Jobs/152_Sales_Jobs_/
Pages added to sitemap: 4681
Pages scanned: 4960 (190,494.3 Kb)
Pages left: 55170 (+ 18499 queued for the next depth level)
Time passed: 24:02
Time left: 267:22
Memory usage: 34,456.4 Kb
Resuming the last session (last updated: 2008-08-12 15:36:48)

thanks
dave

nirav.dave

hi,

I am still waiting for the reply.

cheers
nirav

XML-Sitemaps Support

Hello,

please try to install the latest version of sitemap generator (2.9) - you have v2.7 running now.

nirav.dave

Hi,

I installed the latest version and things have not changed. The crawler now stops every 480 pages. These are my php.ini settings:

max_execution_time = 9000 ; Maximum execution time of each script, in seconds
max_input_time = 9000 ; Maximum amount of time each script may spend parsing request data
memory_limit = 32M ; Maximum amount of memory a script may consume (8MB)

I have sent you a PM with my details

Thanks
dave


nirav.dave

The problem is still the same.

I have increased the memory limit and max execution time, but it still freezes after crawling about 1000-1500 pages.

I have sent you a PM with details.

My phpinfo is here:
www.meramaal.com/phpinfo.php

nirav

nirav.dave

A faster response will definitely help! :)