I just purchased the product and in general I like the concept, except that it will not complete a sitemap. It looks like my situation may be similar to what some other people have posted here, but many of those posts just say to PM an administrator, so I am not exactly sure what the deal is in my case. Here are the details.
I started the script last night and had to manually restart the crawl every couple of minutes. Then I read about setting the crawl to restart automatically after XX seconds of inactivity. I set mine to 10 seconds, and that seemed to help for a while.
However, now I seem to be hitting another roadblock.
Every time I get to 18,260 pages scanned, I get an "allowed memory size exhausted" fatal error, as shown here (see my follow-up question below this message).
Links depth: 3
Current page: ships_store/index.php?p=details&ident=174892&mfc=Harken&sku=418&prod_name=Trigger+Cleat&sectionid=5004
Pages added to sitemap: 18247
Pages scanned: 18260 (684,093.5 KB)
Pages left: 17272 (+ 94682 queued for the next depth level)
Time passed: 0:42:55
Time left: 0:40:36
Memory usage: 115,421.0 Kb
Resuming the last session (last updated: 2013-05-12 19:17:32)
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 208 bytes) in /home/shop/public_html/generator/pages/class.utils.inc.php on line 102
After 10 seconds, the program restarts showing a slightly lower number of scanned pages, and once it climbs back up to this number it bails again.
This leads me to believe that there is a memory limitation on my end: the error says the allowed memory size of 268435456 bytes (256 MB) was exhausted. I know I should probably change some setting, but which setting, and what should I change it to? What are the implications of changing it? Will it negatively impact my overall site performance?
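From what I've read, I'm guessing the relevant setting is PHP's memory_limit directive, which caps how much memory a single PHP script can use. Something like the following is what I have in mind; the 512M value is just my guess at a number above the current 256 MB cap, and the file paths are only where I'd assume it goes on my host, so please correct me if this is the wrong approach:

```
; php.ini (server-wide): raise the per-script memory cap
memory_limit = 512M
```

or, if I can't edit php.ini, maybe via .htaccess in the generator directory (which I gather only works under Apache with mod_php):

```
# .htaccess: raise memory_limit just for scripts in this directory
php_value memory_limit 512M
```

If I understand correctly, this only raises the ceiling a script is *allowed* to reach rather than reserving memory up front, so I'd hope normal site traffic is unaffected, but I'd appreciate confirmation of that.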
I would love to get some assistance ... even if I have to pay more for it.
Thanks for your help.