Hi,

I just purchased your software and the first couple of times it ran OK, indexing/crawling about 40,000 pages.

I made some changes to my site so that more pages can be crawled, and now I get this error.

Please help!!!

Error below:


Links depth: 3
Current page: public/user_details.php?account=overview&id=2545&uid=&mininav=YES
Pages added to sitemap: 19392
Pages scanned: 19400 (541,403.9 KB)
Pages left: 24940 (+ 14250 queued for the next depth level)
Time passed: 0:47:23
Time left: 1:00:55
Memory usage: 66,812.7 Kb
Auto-restart monitoring: Fri Jul 22 2011 13:03:08 GMT-0400 (Eastern Daylight Time) (48 second(s) since last update)
Resuming the last session (last updated: 2011-07-22 15:39:56)
Fatal error: Out of memory (allocated 131596288) (tried to allocate 19923043 bytes) in /nfs/c07/h04/mnt/108344/domains/styleapple.com/html/generator/pages/class.utils.inc.php(2) : eval()'d code on line 27
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (php.ini file), or contact your hosting support about this.
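For reference, the relevant php.ini directives look like this (the values below are only examples — the actual limits your host allows may differ, and the web server usually needs to be restarted for changes to take effect):

```ini
; php.ini — raise PHP's limits so the crawler can finish
; (example values; set them as high as your host permits)
memory_limit = 256M        ; maximum memory a single script may allocate
max_execution_time = 300   ; maximum script run time in seconds
```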
Hi Oleg,

I'm on a shared server and cannot increase the limits beyond the following:
My host's limit for max_execution_time is 120 seconds.
My host's limit for memory_limit is 100M.

I've run the sitemap generator with these limits before and it completed about 40,000 pages.

Now it just stops at 19,000 pages with the out-of-memory error.

Please advise how I can get around this problem.

I really need to get this to work

thank you
Calvin
Hi Oleg,

I bought your Unlimited Sitemap version so I can crawl my entire site.  Are you saying that I will not be able to do this with your software?

Please advise

Calvin
I just checked my URL limit and it's always been at 40,000 URLs.

All of a sudden the script stops working?

Hi Oleg,

Just sent it.

Thanks
Calvin
Hi Oleg,

Thank you - I see that you were able to successfully create a sitemap, but only for a link depth level of 3. Is that enough to crawl/index my entire website? I'm not sure what a link depth of 3 means.

I did notice a problem - not all the images on my site are showing up in my images.xml file. The new file only shows 2,000 images, where I know I have over 14,000 images, as shown in my previous images.xml sitemap when looking at my changelog.

Please help, as my images are a very important part of my website.

Thanks
Calvin
Hello,

limiting the depth level means that the sitemap only includes pages up to 3 clicks away from the homepage (and images located on the last depth level are not indexed, since those pages are added to the sitemap without being fetched from the server). In this way the generator was able to index around 40,000+ pages within the memory limits applied on your server.
Hello Oleg,
Thank you for all the help you've been giving me. I'm still a little confused:

I'm not sure what you meant by the following:

Quote
  (and images located on the last depth level are not indexed, since those pages are added to the sitemap without being fetched from the server).

What's the difference between indexing my images and fetching them? And if my image pages are indexed in the regular sitemap, why wouldn't the images also be indexed in my image sitemap?

Since my images are so important to my website, please advise how to fix this problem so that all my images get added to my sitemap_images.xml file.

Thank you

Sincerely,
Calvin
Hello,

when a page is not fetched from the server, the generator cannot find the images on it (it needs to parse the page's HTML content to do that).
The ultimate solution is to increase memory_limit so the generator can fetch and index all pages.
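Since php.ini usually isn't editable on shared hosting, some hosts honor the same overrides per directory via an .htaccess file. This is only a sketch, and it assumes the host runs PHP as an Apache module (mod_php); on CGI/FastCGI setups these directives cause a 500 error and the host's own override mechanism has to be used instead:

```apacheconf
# .htaccess in the generator's directory — per-directory PHP overrides
# (only honored when PHP runs as an Apache module; values are examples)
php_value memory_limit 256M
php_value max_execution_time 300
```

If the host rejects these, their support can sometimes raise the limits for a single directory on request.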