How can I speed this up?
« on: September 18, 2017, 02:58:40 PM »
It keeps timing out and having to resume from the interrupted section.

Links depth: 3
Current page: nursing-homes/washington/raymond/
Pages added to sitemap: 65372
Pages scanned: 65414 (2,654,974.4 KB)
Pages left: 156533 (+ 58320 queued for the next depth level)
Time passed: 15:53:38
Time left: 38:02:02
Memory usage: 266,939.9 Kb
Resuming the last session (last updated: 2017-09-18 13:44:36)

Please advise
Re: How can I speed this up?
« Reply #1 on: September 19, 2017, 07:36:55 AM »
Hello,

With a website of this size, the best option is to create a limited sitemap: set the "Maximum depth" or "Maximum URLs" option so that the crawler gathers about 100,000-200,000 URLs, i.e. the main pages, which serve as a "roadmap" sitemap for search engines.
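
To make the idea concrete, here is a minimal sketch of depth- and URL-limited crawling. This is not the generator's actual code; the MAX_DEPTH and MAX_URLS constants are hypothetical stand-ins for the "Maximum depth" and "Maximum URLs" settings, and the breadth-first loop just illustrates why either cap keeps the crawl bounded:

```python
# Minimal sketch of a depth- and URL-limited crawl (not the generator's code).
# MAX_DEPTH / MAX_URLS are hypothetical stand-ins for the "Maximum depth"
# and "Maximum URLs" settings.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

MAX_DEPTH = 3        # do not follow links deeper than this
MAX_URLS = 200_000   # stop once the sitemap reaches this size

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url):
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([(start_url, 0)])           # (url, link depth)
    sitemap = []
    while queue and len(sitemap) < MAX_URLS:
        url, depth = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue                          # skip pages that fail to load
        sitemap.append(url)
        if depth >= MAX_DEPTH:
            continue                          # record the page, go no deeper
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href)
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return sitemap
```

Capping by depth tends to keep the top-level category pages (the "roadmap"), while capping by URL count bounds the total run time regardless of site structure.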

The crawling time itself depends mainly on the website's page generation time, since the generator crawls the site much like search engine bots do.
For instance, if it takes 1 second to retrieve each page, then 1,000 pages will be crawled in about 16 minutes.
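
As a sanity check, the same per-page arithmetic reproduces the "Time left" figure from the status dump above. A rough sketch, assuming the estimate is simply average seconds per page times pages left (the generator's own formula may differ):

```python
# Rough reconstruction of the time estimate from the status dump above;
# the generator's own formula may differ.
pages_scanned = 65_414
pages_left = 156_533
time_passed = 15 * 3600 + 53 * 60 + 38          # 15:53:38 -> 57,218 seconds

seconds_per_page = time_passed / pages_scanned  # ~0.87 s per page
remaining = int(pages_left * seconds_per_page)  # ~136,900 seconds

hours, rest = divmod(remaining, 3600)
minutes, seconds = divmod(rest, 60)
print(f"estimated time left: {hours}:{minutes:02d}:{seconds:02d}")
# prints roughly 38:02:00, close to the reported "Time left: 38:02:02"
```

At that rate, the 58,320 URLs queued for the next depth level would add roughly another 14 hours on top of that, which is why capping the crawl is the practical fix here.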