I have a big website (a BSP) that needs a sitemap. I tried the free generator and really like it, because it helps us with Google and Yahoo search, but it only indexes 500 pages, which is definitely not enough, so I got the unlimited version.

Progress has been pretty slow. When I went to bed it seemed to have stopped or been interrupted at depth 3, at around 18,420 pages, and when I woke up and resumed it, I got the fatal error below. Can anyone help me, please? I need this sitemap in a hurry! Thanks!

Links depth: 3
Current page: efa/archives/2009/10/06
Pages added to sitemap: 17240
Pages scanned: 18460 (435,373.5 KB)
Pages left: 28614 (+ 80869 queued for the next depth level)
Time passed: 3:33:56
Time left: 5:31:37
Memory usage: 66,558.5 Kb
Resuming the last session (last updated: 2009-10-18 11:32:28)
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 17053064 bytes) in /xxxxxxxxxxxxxx/generator/pages/class.utils.inc.php(2) : eval()'d code on line 27

I have just changed memory_limit and max_execution_time in php.ini, and now it seems to be working again.
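
For reference, these are roughly the directives I edited (the 512M value is just what I picked for my server; the old limit was 128M, which matches the 134217728 bytes in the error message):

memory_limit = 512M        ; raised from 128M so the crawler has more headroom
max_execution_time = 0     ; 0 disables the script time limit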

It still has at least 10 hours to go; I will report back if I run into problems again.
After I changed memory_limit, my server crashed during peak hours. The generator ate up all the RAM, the machine started swapping, and the load went so high that the server stopped responding.

Then I set the maximum number of pages to crawl to 40,000. It finished in an hour, but it seems to be missing the important parts of the site.

Now I have set it to 200K pages and am crawling again.

For a BSP, will 200K pages be too few? Generating the sitemap takes so long that I'm starting to panic.
Hello,

With a website of this size, the best option is to create a limited sitemap: set the "Maximum depth" or "Maximum URLs" option so that it gathers about 50,000-100,000 URLs, the main pages, which act as a "roadmap" sitemap for search engines.

The crawling time mainly depends on your website's page generation time, since the generator crawls the site much like search engine bots do.
For instance, if it takes 1 second to retrieve each page, then 1,000 pages will be crawled in about 16 minutes.
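
To make the estimate concrete, here is a back-of-the-envelope sketch (plain PHP, not part of the generator; the page count and per-page time are just the example figures above):

<?php
// Rough crawl-time estimate: total pages multiplied by average seconds per page.
// Replace these example figures with numbers measured on your own site.
$pages          = 1000;  // pages to crawl
$secondsPerPage = 1.0;   // average time to generate and download one page

$totalSeconds = $pages * $secondsPerPage;
printf("Estimated crawl time: about %.1f minutes\n", $totalSeconds / 60); // ~16.7 minutes

At that same rate, 200,000 pages would take over 55 hours, which is why limiting the number of URLs (or speeding up page generation) makes such a difference for large sites.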

Some real-world examples from big database-driven websites:
about 35,000 URLs indexed - 1 hour 40 minutes total generation time
about 200,000 URLs indexed - 38 hours total generation time

With "Max urls" options defined it would be much faster than that.