55MB to crawl 9000 urls?
« on: October 28, 2008, 11:23:03 PM »
I was wondering if I am doing something wrong. My server is configured for 128MB, but the script says I used 55MB.

Any suggestions?

Thanks
Re: 55MB to crawl 9000 urls?
« Reply #1 on: October 29, 2008, 11:36:51 PM »
Hello,

What exactly is the issue you are seeing? Do you get an error message?
Re: 55MB to crawl 9000 urls?
« Reply #2 on: October 30, 2008, 12:18:29 AM »
Gee, can't believe I didn't include the error. The script would fail due to not enough memory. I have raised the limit to my maximum allowed, which is 128MB. I am running the script right now and it is at 110MB, with only 24,000 of the roughly 300,000 pages in the queue added to the sitemap so far, so I don't think it will finish crawling before the memory is exhausted again. On this run I have the script set to pause for 90 seconds every 1,000 pages, which did help it get that far.

The reason for the post was that the last time I ran the script it used 55MB of memory for only 9,000 pages, which seems like a lot. I have tried using both crawl methods listed at the bottom of the config page.

Thanks
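For anyone comparing numbers, memory growth during a long crawl can be watched from inside PHP itself with memory_get_usage(). Below is a minimal sketch only; the loop and crawl_next_page() are hypothetical placeholders, not the generator's actual code:

Code:
<?php
// Minimal sketch: log memory consumption while crawling.
// crawl_next_page() is a hypothetical placeholder for whatever
// fetches and parses one URL in your own loop.
$processed = 0;
while ($url = crawl_next_page()) {
    $processed++;
    if ($processed % 1000 === 0) {
        // Report current and peak usage in MB every 1,000 pages.
        printf(
            "%d pages, %.1f MB in use (peak %.1f MB)\n",
            $processed,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576
        );
    }
}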
Re: 55MB to crawl 9000 urls?
« Reply #3 on: October 30, 2008, 12:30:14 AM »
This is the error I got after my last post:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 868240 bytes) in /home/cw4935/public_html/generator/pages/class.grab.inc.php(2) : eval()'d code on line 312

26,000 pages were added to the sitemap before all the memory was exhausted.

I'll try making the pause time longer between requests.

Thanks
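For reference, the "Allowed memory size of 134217728 bytes" in that error is exactly 128MB, i.e. the configured PHP memory_limit being hit. A minimal sketch of checking and, where the host permits it, raising the limit at runtime follows; whether this particular host allows overriding memory_limit, and whether the generator honors it, is an assumption:

Code:
<?php
// Minimal sketch: inspect and, if the host allows it, raise the PHP memory limit.
// 134217728 bytes in the error above is exactly 128M, the configured memory_limit.
echo 'Current limit: ' . ini_get('memory_limit') . "\n";

// Attempt to raise the limit; many shared hosts ignore or forbid this.
if (ini_set('memory_limit', '256M') === false) {
    echo "Host does not allow overriding memory_limit at runtime.\n";
}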
Re: 55MB to crawl 9000 urls?
« Reply #4 on: October 31, 2008, 12:09:36 AM »
Please PM me your generator URL/login if you are not able to sort it out.