Hi, can you explain the program's timeline, i.e., what to expect once it starts crawling a site?
For example, when does it write to sitemap.xml: while the crawl is ongoing, or only after the program finishes? Does it write anything when a manual crawl is stopped?
Scenario: the program reports that 800 pages have been added. Should those appear in sitemap.xml while it's still crawling? If I stop the crawl, will it write them to the file?
Roughly how long (a range is fine) should a 10,000-link crawl take?
FYI: the trial version worked flawlessly, but the pro version seems to have a mind of its own. Understanding the process better, and knowing what to expect, would be helpful.
I can't locate this info on the site, but if it's there, could you provide a link?