
55MB to crawl 9000 URLs?

Started by ajerebic, October 28, 2008, 11:23:03 PM


ajerebic

I was wondering if I am doing something wrong. My server is configured for 128MB, but the script says I used 55MB.

Any suggestions?

Thanks
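
As a reference point, here is a minimal bit of plain PHP for checking what limit the process actually sees and how much memory it is using; this is generic diagnostic code, not part of the generator itself:

<?php
// Report the configured limit and the current/peak allocation for this process.
echo 'memory_limit:  ' . ini_get('memory_limit') . "\n";
echo 'current usage: ' . round(memory_get_usage(true) / 1048576, 1) . " MB\n";
echo 'peak usage:    ' . round(memory_get_peak_usage(true) / 1048576, 1) . " MB\n";

// If the host permits it, the limit can also be raised at runtime:
// ini_set('memory_limit', '256M');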


ajerebic

Gee, can't believe I didn't include the error. The script would error out due to not enough memory. I have upped it to the maximum memory allowed, which is 128MB. I am running the script right now and am at 110MB with only 24,000 of the roughly 300,000 pages in the queue added to the sitemap so far, so I don't think it will get through everything before memory is exhausted again. On this run I have the script set to pause for 90 seconds every 1,000 pages, which did help it get that far.

The reason for the post was that the last time I ran the script it used 55MB of memory for only 9,000 pages, which seems like a lot. I have tried both crawl methods listed at the bottom of the config page.

Thanks
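
For what it's worth, a rough sketch of what the pause-every-1000-pages option amounts to, assuming a simple crawl loop; crawl_page(), add_to_sitemap() and the seed URL are stand-ins for illustration, not the generator's real functions or settings:

<?php
// Stub: fetch one page and return any new links found on it.
function crawl_page($url) {
    return array();
}

// Stub: append one entry for $url to the sitemap file.
function add_to_sitemap($url) {
}

$pagesPerBatch = 1000;                    // pages to crawl before pausing
$pauseSeconds  = 90;                      // pause length between batches
$queue = array('http://example.com/');    // hypothetical seed URL
$seen  = array();
$count = 0;

while (!empty($queue)) {
    $url = array_shift($queue);
    if (isset($seen[$url])) {
        continue;                         // skip URLs already crawled
    }
    $seen[$url] = true;

    foreach (crawl_page($url) as $link) { // queue newly discovered links
        $queue[] = $link;
    }
    add_to_sitemap($url);

    if (++$count % $pagesPerBatch === 0) {
        sleep($pauseSeconds);             // the 90-seconds-every-1000-pages pause
    }
}

Note that the pause itself doesn't hand any memory back; in a loop like this the queue and the visited-URL list grow with every page discovered, so memory climbing with the page count is expected however long the script sleeps.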

ajerebic

This is the error I got after my last post:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 868240 bytes) in /home/cw4935/public_html/generator/pages/class.grab.inc.php(2) : eval()'d code on line 312

26,000 pages were added to the sitemap before all memory was exhausted (134217728 bytes is exactly the 128MB limit).

I'll try making the pause time longer between requests.

Thanks