The sitemap run issues a report, then appears to run again, but starts to loop and causes the site to be blocked due to high load. Note the times in the logs below. I have configured a 10-second delay after every 100 pages, but there is no improvement. The site only has about 800 pages.
We did not appear to have this issue until our host recently changed servers.
Thanks for your help
Roy

The Sitemap report details
-------------------------
Request date: 8 January 2012, 16:11
Processing time: 0:25:06s
Pages indexed: 723
Sitemap files: 1
Pages size: 18.03Mb
 
This shows that the first run of the sitemap had completed before the issues began.
The host's response is below:
"....This is an issue relating to how much processor time the sitemap takes while executing. The account is reaching a combined 6,000 seconds of CPU time in a one-hour timeframe. This 6,000 seconds in an hour is possible because the account is requesting items 4-5 times per second in a separate process. From the logs from 16:00 to 17:00 today:

[2012-01-08 16:18:04]: info: [usr/grp]: wickedti/wickedti cmd: /home/wickedti/public_html/product.php php: /usr/local/php52/bin/php
Jan 08 16:18:04 User: wickedti real: 0.17 user: 0.13 sys: 0.20 mem: 3911680 CharsR: 698 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd
[2012-01-08 16:18:04]: info: [usr/grp]: wickedti/wickedti cmd: /home/wickedti/public_html/product.php php: /usr/local/php52/bin/php
Jan 08 16:18:04 User: wickedti real: 0.18 user: 0.14 sys: 0.10 mem: 3911680 CharsR: 698 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd
[2012-01-08 16:18:04]: info: [usr/grp]: wickedti/wickedti cmd: /home/wickedti/public_html/product.php php: /usr/local/php52/bin/php
Jan 08 16:18:04 User: wickedti real: 0.17 user: 0.13 sys: 0.20 mem: 3911680 CharsR: 714 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd
[2012-01-08 16:18:04]: info: [usr/grp]: wickedti/wickedti cmd: /home/wickedti/public_html/product.php php: /usr/local/php52/bin/php
Jan 08 16:18:04 User: wickedti real: 0.19 user: 0.14 sys: 0.10 mem: 3911680 CharsR: 698 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd
[2012-01-08 16:18:04]: info: [usr/grp]: wickedti/wickedti cmd: /home/wickedti/public_html/product.php php: /usr/local/php52/bin/php
Jan 08 16:18:05 User: wickedti real: 0.18 user: 0.14 sys: 0.20 mem: 3911680 CharsR: 714 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd

Here we can see that at 16:18:04-05, product.php was requested five times within the same second for this sitemap generation:

31.3.252.155 - - [08/Jan/2012:16:18:03 +0000] "GET /Dot-Mesh-Boyshorts-pr-18353.html HTTP/1.0" 200 24139 "[ External links are visible to forum administrators only ]" "Mozilla/5.0 (compatible; XML Sitemaps Generator; https://www.xml-sitemaps.com) Gecko XML-Sitemaps/1.0"
31.3.252.155 - - [08/Jan/2012:16:18:03 +0000] "GET /Eau-Lazur-Briefs-by-Gracya-pr-18219.html HTTP/1.0" 200 26583 "[ External links are visible to forum administrators only ]" "Mozilla/5.0 (compatible; XML Sitemaps Generator; https://www.xml-sitemaps.com) Gecko XML-Sitemaps/1.0"
31.3.252.155 - - [08/Jan/2012:16:18:03 +0000] "GET /Elektra-Briefs-by-Roza-pr-17969.html HTTP/1.0" 200 26627 "[ External links are visible to forum administrators only ]" "Mozilla/5.0 (compatible; XML Sitemaps Generator; https://www.xml-sitemaps.com) Gecko XML-Sitemaps/1.0"
31.3.252.155 - - [08/Jan/2012:16:18:04 +0000] "GET /Elena-Short-Briefs-pr-18464.html HTTP/1.0" 200 21509 "[ External links are visible to forum administrators only ]" "Mozilla/5.0 (compatible; XML Sitemaps Generator; https://www.xml-sitemaps.com) Gecko XML-Sitemaps/1.0"

We can also see that single processes were allowed to run for over 15 minutes:

Jan 08 16:25:56 User: wickedti real: 945.10 user: 2.89 sys: 0.19 mem: 180860695 CharsR: 283988 CharsW: 693052 BytesR: 548864 BytesW: 0 SType: httpd
Jan 08 16:16:20 User: wickedti real: 603.26 user: 0.30 sys: 0.10 mem: 178560190 CharsR: 154962 CharsW: 0 BytesR: 0 BytesW: 0 SType: httpd

This max setting has been adjusted as these processes caused the entire server load to reach critical levels, resulting in some of the load errors you've seen there. We cannot allow processes to run this long in shared hosting, as other customers require resources as well..."
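The host's 6,000-second figure is roughly consistent with the per-request CPU times in the process log above: each product.php hit uses about 0.3s of combined user+sys CPU, and at 4-5 requests per second that adds up over an hour. A quick sanity check (the per-request figures are taken from the log excerpt; the host's exact accounting method is an assumption):

```python
# Back-of-envelope check of the host's "6,000 CPU-seconds per hour" claim.
cpu_per_request = 0.13 + 0.20   # user + sys seconds for one product.php hit (from the log)
requests_per_second = 5         # host reports 4-5 requests per second
window_seconds = 3600           # one hour

total_cpu = cpu_per_request * requests_per_second * window_seconds
print(f"{total_cpu:.0f} CPU-seconds per hour")  # close to the 6,000-second limit
```

So sustained burst crawling at this rate alone is enough to trip the host's limit, even with each individual request being cheap.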
Hello,

You can try a 1-second delay after each request. If the crawler keeps finding pages in a loop, please PM me your generator URL/login.
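For reference, the suggested setting amounts to a per-request sleep rather than a pause every 100 pages. A minimal sketch of that throttle (the function names are illustrative, not the generator's actual code; the HTTP `fetch` function is injected so the pacing logic stands alone):

```python
import time

DELAY_SECONDS = 1.0  # pause after every single request, as suggested above

def crawl_politely(urls, fetch, delay=DELAY_SECONDS, sleep=time.sleep):
    """Fetch each URL in turn, sleeping after every request.

    `fetch` is whatever HTTP call you use (urllib, requests, ...);
    it is passed in so the throttling logic itself stays testable.
    """
    pages = {}
    for url in urls:
        pages[url] = fetch(url)
        sleep(delay)  # one request at a time, never 4-5 per second
    return pages
```

With 800 pages this adds roughly 13 minutes of wall-clock time to a crawl, but it keeps the request rate well under the burst pattern the host flagged.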