Excellent software. Thanks for designing it.
I have been using V2.1 for about 4 months and it works great, except for one problem: it fails on memory size. That I can live with, and have for my entire usage.
The problem I have now is that I need to use it for another client, so I bought a copy today (V2.2) and installed it properly. It works, sort of, but it takes an extremely long time to do its job; so much so that I went back to the old V2.1 copy I had for another client.
The exact problem: crawling a site with 12,000+ links takes forever. I set the config "Maximum Pages" = 3600, and the run takes over 20 minutes and then fails on memory usage. I then backed it down to "Maximum Pages" = 3500; it still took over 20 minutes, but it passed and gave me an XML file. The file looks good and was accepted by Google.
Where I have a real problem is that when I installed V2.1 on the same server, for the same site, with the same parameters, it takes only 3 minutes to run the crawl and produce the same XML file.
Question: why does V2.2 take so long? Is it a parsing problem, or possibly a problem with the code execution on my server?
Also, I have read just about every post on this site to try to get the memory failure figured out, but none of the suggestions provided are viable, as my server will not let me raise the limit above 8 MB.
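For reference, here is what I have been trying. This is only a sketch and assumes the tool is PHP-based (the 8 MB ceiling matches the old PHP default `memory_limit`); if the tool runs on something else, none of this applies:

```shell
# Check the memory limit PHP actually reports (assumption: PHP-based tool)
php -r 'echo ini_get("memory_limit"), PHP_EOL;'

# Some hosts honor a per-invocation override even when php.ini is locked;
# whether this sticks depends entirely on the host's configuration
php -d memory_limit=32M -r 'echo ini_get("memory_limit"), PHP_EOL;'
```

On my server the reported value never goes above 8M no matter where I set it (php.ini, .htaccess, or at runtime), which is why the usual suggestions don't help me.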
Any help on any part of this post would be greatly appreciated.