Hello,
my site has more than 3,000 pages.
When we launch the crawling script, it stops after 150-200 pages.
We have tried to increase the memory limit, but our web host says it is capped at 70 MB - is that not enough?
We have also tried to raise the max_execution_time setting, but the host caps CPU execution at 40 seconds.
We also tried running it as a cron job, but the host limits each cron job to 6 minutes of runtime.
The host said that if I need more than the limits above, they could make me an offer for a managed server. But that would mean paying about $100 more per month just to run the XML sitemap script - that doesn't make sense, and I'm sure there is a way to solve this.
Can you advise what we could do so the crawl runs without interruptions?
My idea was to have a cron job trigger the crawling script once a week; once started, the crawler should simply complete its job. Are there any special settings that would make this work smoothly?
Any answers that help solve this will be highly appreciated - hope to hear from you soon!