I managed to convince tech support to raise the memory_limit for a single day, and it worked... but after an hour or so the script started all over. It says it is resuming the last session from 1970-01-01 00:00:00 (the Unix epoch, which suggests the saved session timestamp was reset to zero). I checked crawl_dump.log and its file size had dropped sharply: the "Page added to sitemap" count was over 90,000, then fell to about 8,000. I hope it doesn't happen again, since I only have the increased memory_limit for one day.
Also, sitemap.xml is still 0 bytes. I changed its permissions to 0666 before starting the crawl... or is an empty file expected until crawling finishes?
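For anyone checking the same symptoms, here is a minimal shell sketch of the sanity checks I am running while the crawl is in progress. The file names come from my setup above; the assumption (not confirmed by the generator's docs) is that the crawler buffers state in crawl_dump.log and only writes the final XML at the end, so a zero-byte sitemap.xml mid-crawl may be normal:

```shell
# Files from my install; adjust paths to your own directory.
SITEMAP=sitemap.xml
DUMP=crawl_dump.log

# 0666 on the file alone is not enough if the web-server user cannot
# write to the parent directory; check both.
[ -w "$SITEMAP" ] && echo "sitemap.xml is writable"
[ -w "$(dirname "$SITEMAP")" ] && echo "output directory is writable"

# Watch the dump file while the crawl runs; a sudden drop in size
# means the session state was reset and the crawl starts over.
wc -c 2>/dev/null < "$DUMP"
```

Running this every few minutes (e.g. via `watch` or a cron entry) makes it obvious the moment the dump file shrinks, which is when my crawl restarted.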