I click this "Continue the interrupted session (2009-05-23 01:35:31, URLs added: 70028, estimated URLs left: 1)"
 
then I click    Run in background
  Click  Do not interrupt the script even after closing the browser window until the crawling is complete

My last finished sitemap was:
Request date: 7 April 2009, 21:42
Processing time: 2107.00s
Pages indexed: 5225
Sitemap files: 1
Pages size: 146.36Mb
Download: XML sitemap
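For reference, the downloadable file follows the standard sitemaps.org protocol; a minimal sitemap entry looks like this (the date and the optional changefreq/priority fields are illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.WebSuccess4You.biz/</loc>
        <lastmod>2009-04-07</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>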

    Any suggestions?
http://www.WebSuccess4You.biz
Web-Success-Directory
//websuccess4u.blogspot.com
These are the changes I made after my initial post:

Maximum pages: I changed this from 0 to -1 ("0" for unlimited)
Maximum depth level: 0 ("0" for unlimited)
Maximum execution time, seconds: 0 ("0" for unlimited)
Save the script state, every X seconds: I changed this from 180 to 90 (this option allows resuming the crawl if it was interrupted; "0" for no saves)
Make a delay between requests, X seconds after each N requests: I changed this to 1s after each 1 request (this option reduces the load on your web server; "0" for no delay)
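As background, here is a rough sketch of how the last two options (periodic state saving and request throttling) typically work inside a PHP crawler. This is illustrative only, not the generator's actual code; the state file name and the fetch step are hypothetical:

    <?php
    // Illustrative sketch -- not the xml-sitemaps generator's actual code.
    $queue = array('http://www.example.com/');  // URLs waiting to be crawled
    $saveInterval = 90;  // save state every 90 seconds ("0" = no saves)
    $delaySeconds = 1;   // pause 1 second...
    $delayEvery   = 1;   // ...after each 1 request ("0" = no delay)

    $lastSave = time();
    $requests = 0;

    while (($url = array_shift($queue)) !== null) {
        // Fetch the page; a real crawler would parse it and append
        // newly discovered links to $queue here.
        @file_get_contents($url);
        $requests++;

        // Throttle: sleep after every N requests to reduce server load.
        if ($delaySeconds > 0 && $delayEvery > 0 && $requests % $delayEvery === 0) {
            sleep($delaySeconds);
        }

        // Periodically persist the queue so an interrupted crawl can resume.
        if ($saveInterval > 0 && (time() - $lastSave) >= $saveInterval) {
            file_put_contents('crawl_state.dat', serialize($queue));
            $lastSave = time();
        }
    }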
http://www.WebSuccess4You.biz
Web-Success-Directory
//websuccess4u.blogspot.com
Hello,

Please PM me your generator URL/login so I can check that.
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.
Maybe you can give me a little information first, like "Try updating to the newest version of XML-Sitemaps," before I give you the username & password to my site.
I ran it again with the changes and this is what I got:
"Run in background
Do not interrupt the script even after closing the browser window until the crawling is complete
Resume last session
Continue the interrupted session (2009-05-30 19:53:31, URLs added: 70088, estimated URLs left: 1)
Click button below to start crawl manually: "
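The "do not interrupt the script even after closing the browser window" behavior is typically implemented in PHP with ignore_user_abort(). A minimal sketch of the technique (illustrative, not necessarily how this generator does it):

    <?php
    // Keep the script running even if the visitor closes the browser window.
    ignore_user_abort(true);
    // Lift PHP's own execution time limit (server-level limits may still apply).
    set_time_limit(0);

    // ...the long-running crawl loop would go here...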

;) I did it for the 7th time and this is all I got.
http://www.WebSuccess4You.biz
Web-Success-Directory
//websuccess4u.blogspot.com
Please try to increase memory_limit and max_execution_time in PHP configuration.
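For example, in php.ini (the values below are only illustrative; pick limits appropriate for the server):

    ; php.ini -- raise limits for a long-running crawl (values illustrative)
    memory_limit = 256M
    max_execution_time = 600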
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.
The parameters that you told me to use in your software do NOT exist.
So this is what I did:

Crawler Limitations, Finetune (optional)
Maximum pages: 0 ("0" for unlimited)
Maximum depth level: 0 ("0" for unlimited)
Maximum execution time, seconds: 0 ("0" for unlimited)
Save the script state, every X seconds: 20 (this option allows resuming the crawl if it was interrupted; "0" for no saves)
Make a delay between requests, X seconds after each N requests: 0 (this option reduces the load on your web server; "0" for no delay)

And on the Crawling page, this is what I did:
Run in background [X checked]
Do not interrupt the script even after closing the browser window until the crawling is complete
Resume last session [X checked]
Continue the interrupted session (2009-05-30 19:53:31, URLs added: 70088, estimated URLs left: 1)
Click button below to start crawl manually:

     Please respond. Thank you.

http://www.WebSuccess4You.biz
Web-Success-Directory
//websuccess4u.blogspot.com
This is what is happening when I run it at those above parameters:

Links depth: 64725
Current page: GotLinks/Link-Partners.php?Cnm=SEO+Services&ctgid=49&p=64725
Pages added to sitemap: 70128
Pages scanned: 71720 (1,134,894.4 Kb)
Pages left: 1 (+ 0 queued for the next depth level)
Time passed: 1024:58
Time left: 0:00
Memory usage: 64,292.5 Kb
Resuming the last session (last updated: 2009-05-30 19:53:30)
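Worth noting: the current page URL ends in p=64725 and the links-depth counter matches it, which suggests the crawler is walking an effectively endless series of paginated listing pages, one page per depth level, always reporting "Pages left: 1". If the generator offers a URL-exclusion option, a check along these lines (hypothetical helper, not an actual generator setting) would stop those pages from being queued:

    <?php
    // Hypothetical exclusion check showing how paginated listing URLs like
    // GotLinks/Link-Partners.php?...&p=64725 could be skipped so the crawl
    // can terminate.
    function shouldSkip($url) {
        return preg_match('~GotLinks/Link-Partners\.php\?.*\bp=\d+~', $url) === 1;
    }

    var_dump(shouldSkip('GotLinks/Link-Partners.php?Cnm=SEO+Services&ctgid=49&p=64725')); // bool(true)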
http://www.WebSuccess4You.biz
Web-Success-Directory
//websuccess4u.blogspot.com
That parameter should be changed in the PHP configuration on the server, not in the sitemap generator configuration.
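The "Memory usage: 64,292.5 Kb" figure in the status above suggests the crawl is brushing against a 64M memory_limit. Where php.ini itself is not editable, a common alternative on Apache with mod_php is a per-directory override (assuming the host allows .htaccess overrides; values illustrative):

    # .htaccess -- per-directory PHP limits (requires Apache with mod_php)
    php_value memory_limit 512M
    php_value max_execution_time 0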
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.