The crawling process stops with no error after 15-20 minutes on my large Joomla website. I can resume it several times, but it keeps stopping after 15-20 minutes. I have only managed to reach 1,300 URLs instead of more than 200,000.
=> What can I do?

I then tried a cron job with an online service, but I got the following response:
HTTP/1.1 200 OK
Date: Sun, 02 Jun 2013 15:30:04 GMT
Server: Apache
X-Powered-By: PHP/5.4.15
Transfer-Encoding: chunked
Content-Type: text/html

This tool can be executed in command line mode only


=> Does the cron job script work with an online cron job service?
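For context, the "command line mode only" message above typically means the online service requested the script over HTTP (the web SAPI) rather than running it as a real CLI process. A sketch of the difference, with hypothetical paths:

```shell
# An online cron service can only trigger the script over HTTP, e.g.:
curl "http://example.com/generator/index.php"
# -> runs under the web SAPI, so the script refuses with
#    "This tool can be executed in command line mode only"

# A cron job on the server itself invokes the PHP CLI binary directly:
/usr/bin/php /path/to/generator/index.php
# -> runs under the cli SAPI, which the script accepts
```

So an online cron service won't work here; the cron job has to run on the server hosting the script.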


I'm experiencing the same issue... I just bought the standalone version and changed a lot of settings with my host, but still no luck :(

Is there a staff member who can see what might be wrong?
Same as 'prtechpro'... it creates a sitemap and then freezes without any error message. My site contains over 500k pages and I can only get about 2k indexed... and that's after starting it over a couple of times :(

It looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact hosting support about this.
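For reference, in php.ini those two settings look roughly like this (the values below are only illustrative; the right numbers depend on the site size and the host's policy):

```ini
; Illustrative php.ini values -- adjust for your site and host
memory_limit = 512M        ; maximum memory a single PHP process may use
max_execution_time = 3600  ; maximum script runtime in seconds (0 = unlimited)
```

Note that the PHP CLI binary normally ignores max_execution_time (it defaults to unlimited there), which is one reason command-line execution is suggested for very large crawls.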
memory_limit: 512M
max_execution_time: 660

Isn't that enough?
My host tells me they can do anything you guys suggest, but right now the script is already set to run for 10 minutes, which is not standard and not as secure as they'd like.

Can you look at my settings? Isn't there another way to use this script? My host even asked if they could run a command from the command line (CLI)... is that possible?

If I really can't use this, then I'd like to request a refund...
Yes, running it from the command line is recommended in this case. The command to use can be found on the generator's Crawling page.
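As a sketch only (the exact command is shown on the generator's Crawling page; the binary path and script name below are assumptions), a server-side crontab entry invoking the PHP CLI might look like:

```shell
# Hypothetical crontab entry -- replace the paths with the exact command
# shown on the generator's Crawling page.
# Runs the crawl once a day at 03:00; the PHP CLI has no
# max_execution_time limit by default, so long crawls can finish.
0 3 * * * /usr/bin/php /path/to/generator/runcrawl.php > /dev/null 2>&1
```

Because this runs under the cli SAPI rather than through the web server, web-server and FastCGI timeouts no longer apply, and the "command line mode only" check passes.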