Crawler Stopped again
« on: July 14, 2010, 10:58:18 AM »
Hello,

I still can't run it because several problems keep occurring. This is the actual error:

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 14618772 bytes) in /home/shopde/public_html/generator/pages/class.utils.inc.php(2) : eval()'d code on line 27

My server has plenty of space and I don't know how to solve this. Is there any chance of getting help here so the sitemap generator finally starts working? I have already spent hour after hour without any useful results.
Re: Crawler Stopped again
« Reply #2 on: July 16, 2010, 02:31:19 AM »
OK - I tried exactly as you advised, but the problem persists. What should I do now to get it running as described?

The message is now:

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 63840131 bytes) in /home/preis/public_html/generator/pages/class.utils.inc.php(2) : eval()'d code on line 27
« Last Edit: July 16, 2010, 02:33:39 AM by info1366 »
Re: Crawler Stopped again
« Reply #4 on: July 16, 2010, 04:02:38 PM »
Thanks - but 200,000 is not unlimited (I thought I purchased a version for unlimited entries), and even using and setting the max options, the script still stops  >:(

Meanwhile I've bought another sitemap writer. Even though it runs on a local PC, it's much faster, doesn't stop, and doesn't produce any errors. In just one night it created sitemap files with more than 600,000 entries.

So I ask myself whether I gave my money away for an "unlimited generator" that is actually limited and creates problems of its own. Sorry - I don't mean to stir up trouble - but that's how it seems, because this generator doesn't work the way the promotion claimed and promised!
Re: Crawler Stopped again
« Reply #5 on: July 16, 2010, 09:17:31 PM »
Hello,

You would need to increase the PHP memory_limit setting to match the number of URLs you want to index.
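
For example, to raise the limit to the 512 MB shown in your second error message (or higher), any one of the following should work, depending on what your host allows - the file locations and the 512M value here are only illustrative, so adjust them to your setup:

; in php.ini
memory_limit = 512M

# in .htaccess (Apache with mod_php)
php_value memory_limit 512M

<?php
// or near the top of the generator's entry script, before crawling starts
ini_set('memory_limit', '512M');

Since 512 MB was already exhausted in your case, a very large site may need an even higher value, e.g. 1024M.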

The crawling time itself depends mainly on the website's page generation time, since the crawler fetches the site much like search engine bots do.
For instance, if it takes 1 second to retrieve each page, then 1,000 pages will be crawled in about 16-17 minutes (roughly 1,000 seconds).
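
As a rough back-of-the-envelope sketch of that arithmetic (this is not part of the generator; the per-page time and page count are just the figures quoted above):

<?php
// crude crawl-time estimate: total time ~= pages * average seconds per page
$secondsPerPage = 1.0;   // assumed average fetch/generation time per page
$pages = 1000;           // number of pages to crawl
$totalSeconds = $pages * $secondsPerPage;
printf("~%.0f minutes\n", $totalSeconds / 60);  // prints "~17 minutes"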

Some real-world examples for big database-driven websites:
about 35,000 URLs indexed - 1h 40min total generation time
about 200,000 URLs indexed - 38 hours total generation time

With "Max urls" options defined it would be much faster than that.