I have noticed numerous questions on this site about the crawler stopping after a few minutes, with only a thousand pages, or maybe just a few hundred, getting indexed. In many of these cases you tell people to increase the memory limit and the script timeout limit.
I have told you that my host gave me this statement:
"I have uploaded a php.ini to your account, and within that file, set your memory limit to 64M. I cannot, however, update the script timeout limit, as we have a server-side limit of 60 seconds in place, and changing it at the account level won't override it."
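For reference, the php.ini directives the host is describing would look something like the sketch below (the 64M value comes from their statement; the exact file they uploaded may differ). Note that because the 60-second timeout is enforced server-side, raising max_execution_time here would have no effect:

```ini
; php.ini uploaded by the host (illustrative, based on their statement)
memory_limit = 64M        ; raised per the host's statement
max_execution_time = 60   ; server-side cap; higher account-level values are ignored
```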
You requested that I send my URL to you as a private message. I did this over a week ago and have never heard back from anyone, nor has anyone replied to my questions in this thread.
So, I have had to keep restarting the crawl every 5-6 minutes, and this has been going on for over two days, since it is impossible to keep doing this continuously. Now it appears that the crawler has either started a new sitemap and is overwriting two days of work, or it is splitting the sitemap into more than one file; there seems to be no way to tell which. Do you have any help for people in this situation? This has become unbearable, and without some help I will have to abandon the effort.