Sitemap crawl stops constantly

Started by Kinjo187, May 15, 2019, 04:39:35 PM


Kinjo187

Hello!  ;)

We have been using your paid sitemap generator for about 5 years without issues. I recently moved web servers, and now the sitemap crawl stops constantly; I have to click Resume, which carries on for about 30 seconds and then stops again, even though the status still shows it as crawling.

I have now tried every solution I can find on your website, and it has made things worse more than anything. I have also changed the max depth level from 150 to 20 and it still does the same thing. Below is some information on how I have things set up:

PHP SETTINGS
allow_url_fopen On
file_uploads On
max_execution_time 9000
max_input_time -1
memory_limit 256M
post_max_size 64M
short_open_tag On
upload_max_filesize 64M
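
For reference, here is roughly how those values would look in php.ini (the file location varies by host, and this is a sketch of my setup rather than an exact copy of the file):

; limits relevant to long-running crawls
allow_url_fopen = On
file_uploads = On
max_execution_time = 9000    ; seconds a script may run
max_input_time = -1          ; no limit on reading input
memory_limit = 256M
post_max_size = 64M
short_open_tag = On
upload_max_filesize = 64M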

Sitemap Settings
Add directly in sitemap (do not parse) URLs:
tag/
feed/

Make a delay between requests, X seconds after each N requests:
1 second after each 1 request

I added the delay above because I noticed the crawl was pushing my CPU up to 100%, but no matter what I did, it still stopped after about a minute.

Can you take a look, please? I can send you the login info if needed.

Thanks.

XML-Sitemaps Support

Hello, 

In this case I would recommend running the generator from the command line, if you have SSH access to your server.

Kinjo187

Hmm, okay. Any chance you could send me a suitable command line when you get the chance?

Thanks.

XML-Sitemaps Support

Hello,

The full command line to use is displayed on the Crawling page of the sitemap generator and looks like this:
php /path/to/generator/runcrawl.php
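
If the crawl might outlive your SSH session, one option (plain shell usage, not a generator-specific feature) is to start it with nohup so it keeps running after you disconnect. Substitute your real installation path for the placeholder; the log file name here is just an example:

nohup php /path/to/generator/runcrawl.php > crawl.log 2>&1 &
tail -f crawl.log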

Kinjo187

I thought that was just for cron jobs, but great :) Thanks for your time.
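
For anyone who finds this thread later: the same command also works as a cron job if you want the crawl to run on a schedule. A rough crontab sketch (the 2:00 AM schedule is only an example, the path is the placeholder from above, and you may need the full path to the php binary depending on your cron environment):

0 2 * * * php /path/to/generator/runcrawl.php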