Firstly, on a small site with a few pages it worked perfectly, apart from occasionally putting the wrong anchor text on a link in the HTML site map.
However, on a larger site (currently under 300 pages) it goes incredibly slowly, hangs at around page 180-200 and does nothing more. The interrupt request to stop it crawling never works properly; even after I use it several times and log out and back in, it sometimes still does not work.
It never crawls all the pages I ask it to through to completion, so it never creates anything.
Why is this? What's wrong?
***update***
After following advice found elsewhere on this site I managed to get it to finish crawling, but it seems to be listing pages wrongly. I set it to put 50 URLs per page, and it lists 260-ish URLs in the sitemap HTML file.
However, it does not put 50 URLs per page. In fact, I was expecting 6 pages and got seven: the last page had a dozen or so links, the 6th page had fewer than 10, and the 5th page was similar.
I don't think that's 50 per page, is it? So what could be going wrong?
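For comparison, here is a minimal sketch of the split behaviour I would expect (written in Python purely as an illustration, since I don't know what language the generator uses; the example.com URLs and the split_urls helper are hypothetical, not part of the tool): every page gets filled to the limit, and only the last page comes up short.

```python
def split_urls(urls, per_page):
    """Chunk a list of URLs into pages of at most per_page entries."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# Hypothetical stand-in for my ~260 crawled URLs.
urls = [f"https://example.com/page{n}" for n in range(260)]

pages = split_urls(urls, 50)
print(len(pages))               # 6 pages, not 7
print([len(p) for p in pages])  # [50, 50, 50, 50, 50, 10]
```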
***update2***
Now I can get it to generate site maps; sometimes it finishes, sometimes it doesn't. I set it to split my pages at 60 URLs per page (Google doesn't like too many URLs per page, so I don't want to go overboard). It shows I have 253 URLs, so I think: 253 divided by 60 is about 5 pages. However, I seem to be getting 3 pages: page one has loads of URLs, pages 2 and 3 have a lot fewer on each, and each a different amount.
That is certainly not several pages with 60 URLs on each. It doesn't matter what value I set it to, it always gets it wrong, putting large amounts on some pages and tiny amounts on others.
I set it once and got 8 pages, 4 of which had fewer than 15 URLs on them; the rest had loads.
I set it to 100 URLs per page and it puts all 253 on one page.
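By the same logic as the sketch above, 253 URLs at 60 per page should give ceil(253 / 60) = 5 pages of 60, 60, 60, 60 and 13 URLs, and at 100 per page it should give ceil(253 / 100) = 3 pages of 100, 100 and 53 URLs. Neither matches what the tool actually produces.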
This has got to be a major bug, hasn't it?