Processing time in about 20-30sec but script interrupted
« on: April 09, 2016, 09:28:13 AM »
Hi,
I'm trying to use the script, but the crawler stops working even though it only runs for 20-30 seconds.

I have to resume the crawler session a couple of times before it finishes properly.

How can I fix this?

Also, I need to create several different cron files with different configurations. Is there a way to do that?
Thanks a lot.
Re: Processing time in about 20-30sec but script interrupted
« Reply #1 on: April 10, 2016, 07:59:53 AM »
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration on your host (the php.ini file), or contact your hosting support about this.
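For reference, the two settings mentioned above would look like this in php.ini (the values are illustrative assumptions; pick limits appropriate for your host and site size):

```ini
; Illustrative values only - adjust to your hosting plan's limits
memory_limit = 512M        ; maximum memory a single PHP script may allocate
max_execution_time = 600   ; maximum script run time in seconds
```

If you can't edit php.ini on shared hosting, some hosts let you override these per directory instead; check with your hosting support which method they allow.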
Re: Processing time in about 20-30sec but script interrupted
« Reply #2 on: April 11, 2016, 05:17:23 PM »
Thanks for your reply.

Our PHP server settings are now:
memory_limit: 256
max_execution_time: 240.

It's still not working; this is the message in the crawler tab:
Code:
Updated on 2016-04-11 16:11:11 (31 seconds ago) , Time elapsed: 0:00:15,
Pages crawled: 27329 (27329 added in sitemap), Queued: 0, Depth level: 2
Current page: sitemap_it.xml (1.3)

27329 added in sitemap... the script gets all the info, but it doesn't finish properly.