I have a large site. Support ran the script for me once from SSH, and it finished in two days.
Now I have more pages. I started the crawler about three weeks ago, and it just runs and runs but never seems to finish. My web stats show it constantly requesting pages, yet it never completes.
When I click on the Crawling tab, it takes forever to load and then just says it is crawling. What should I do?
Should I stop the current crawl and restart it from SSH? If so, I have two questions:
1. How do I stop the current crawl without it overwriting my last sitemaps?
2. How (in detail, please) do I run the script from SSH?
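For what it's worth, here is my guess at what support may have run last time. The directory and script name below are only placeholders (I don't know the real ones), so please correct whatever is wrong:

```shell
# My guess only -- the generator directory and script name are placeholders.
# Run the crawler in the background with nohup so it keeps going after I log out:
cd /path/to/sitemap-generator
nohup php runcrawler.php > crawl.log 2>&1 &

# Then check on it later with:
tail -f crawl.log
```

Is that roughly the idea, or is there a proper command-line mode with its own options that I should be using instead?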
Thank you in advance for your help.