I'm running the script now with a 3600-second "restart" if the script stops.
That seems to keep it alive, but the script is taking up to 30 seconds per page to scan.
The auto-restart monitors the time since the last update, and since the script pauses after each page is scanned, the timer restarts after every page. The clock gets to about 30 seconds, the next page is scanned, and the clock starts over again, so the 3600-second restart never actually fires.
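As I understand it, the auto-restart check boils down to something like this (a rough Python sketch of the logic only, not the script's actual code; the function name and defaults are mine):

```python
import time

def needs_restart(last_update_ts, timeout=3600.0, now=None):
    """True when the crawler has gone `timeout` seconds without an update."""
    if now is None:
        now = time.time()
    return (now - last_update_ts) > timeout

# Each scanned page refreshes the timestamp every ~30 seconds, so the
# 3600-second window never expires:
print(needs_restart(0, now=30))    # -> False (a page finished 30 s ago)
print(needs_restart(0, now=3700))  # -> True (truly stalled)
```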
I've removed the cron job, but that seems to have no effect.
Here is the most recent data:
Links depth: 3
Current page: category/draperies-567/10
Pages added to sitemap: 2307
Pages scanned: 2344 (116,076.0 KB)
Pages left: 1216 (+ 3420 queued for the next depth level)
Time passed: 2:00:19
Time left: 1:02:25
Memory usage: 7,022.8 Kb
Resuming the last session (last updated: 2015-04-25 18:29:31)
Auto-restart monitoring: Sat Apr 25 2015 12:01:29 GMT-0700 (Pacific Standard Time) (15 second(s) since last update)
The memory usage continues to climb, which suggests to me that memory is the bottleneck. Given the massive size of my site, it seems I need a way to stop the scan after "X" pages, let the script write what it has scanned so far to the sitemap.xml file, free that memory, and then continue the scan.
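Roughly what I have in mind, as a hypothetical sketch (not the script's actual API; `BATCH_SIZE`, `flush_batch`, `crawl`, and the bare `<url>` entries are all my own simplifications, and a real writer would also maintain the XML declaration and `<urlset>` wrapper):

```python
import gc

BATCH_SIZE = 500  # the "X": pages to scan before flushing to disk

def flush_batch(urls, sitemap_path="sitemap.xml"):
    # Append the scanned URLs to the sitemap file, then drop them from memory.
    with open(sitemap_path, "a", encoding="utf-8") as f:
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % url)
    urls.clear()

def crawl(queue, sitemap_path="sitemap.xml"):
    scanned = []
    for url in queue:
        scanned.append(url)  # stand-in for the real per-page scan
        if len(scanned) >= BATCH_SIZE:
            flush_batch(scanned, sitemap_path)  # write results, free the batch
            gc.collect()  # reclaim memory before continuing the scan
    if scanned:  # flush whatever is left at the end
        flush_batch(scanned, sitemap_path)
```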
Is that possible?