Hi
I am trying to run this software, but after it had been running for about an hour I received this error:
Fatal error: Allowed memory size of 268435456 bytes exhausted
At the point where I get this error, the reported status makes it clear that there are still a great many URLs left to crawl:
Links depth: 4
Current page: teams/wolverhampton-wanderers/tab/matches/season/1980
Pages added to sitemap: 11939
Pages scanned: 14720 (471,401.7 KB)
Pages left: 36051 (+ 97430 queued for the next depth level)
Time passed: 1:02:13
Time left: 2:32:23
Memory usage: 137,799.7 Kb
I can see from other posts that many users have encountered this problem, and the advice is usually to raise the memory limit. However, given that I ran out of memory with only a small proportion of our large site crawled, I suspect the server will not have enough memory for the crawl to run to completion!
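For reference, my understanding of "raising the memory limit" is PHP's memory_limit setting (268435456 bytes is 256M), which can be changed globally or just for the crawler process. A rough sketch of what I believe the usual advice amounts to, where 512M is simply an example value I picked and may still not be enough for a site this size:

; in php.ini (applies to all PHP scripts on the server)
memory_limit = 512M

<?php
// Or near the top of the crawler's entry script, for that process only.
// The value is an assumption on my part, not a recommendation.
ini_set('memory_limit', '512M');

Even so, my worry below is that no fixed value will be enough if everything is held in memory until the end.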
One reason I think this is that, despite more than 10,000 links having been crawled, my sitemap.xml file is still empty (permissions appear to be correctly set on all files and folders). This implies the software intends to hold many more URLs in memory before committing them to file (although that does seem strange, so perhaps I am missing the point...).
Has anyone with a very large site had similar problems, and if so, how did you solve them?
Any comments from xml-sitemaps support would also be very welcome.
Thanks, Mike