Hello JAP,
The Standalone Generator script supports the "Resume crawling" feature, but it still needs to restore the full URL list in order to know which URLs have already been indexed and where to parse further links from. So it will still require the same amount of memory and will not help you avoid this.
You can also set the "Maximum number of URLs to index" limit to get at least part of your site indexed.
Basically, the script needs enough memory available to be able to create a large sitemap.
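If your hosting allows it, you can try raising the PHP memory limit the script runs under. Below is a minimal sketch only, assuming the generator runs as a regular PHP script and that your host permits overriding this setting; the actual value you need depends on the number of URLs on your site (512M is just an example):

<?php
// Raise the PHP memory limit before the crawler starts.
// (Example value only -- adjust to what your site/host actually needs.)
ini_set('memory_limit', '512M');

// Alternatively, if you control php.ini, set it there:
//   memory_limit = 512M
// Or in .htaccess on Apache with mod_php:
//   php_value memory_limit 512M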