I am having problems getting the XML-sitemap generator to crawl all of my site. Let me explain:
I have a site full of business listings, divided into separate categories, that I would like indexed. Within each category the business listings are shown 10 at a time, and if there are more than 10, the list is obviously split into several pages with a page navigator at the bottom (like the one in the HTML sitemap that XML-Sitemaps generates). The problem is that this page navigator only shows 10 page links at a time, followed by "next" and "last" links which, when clicked, reveal another set of ten page links, each pointing to a page of 10 business listings. You get the picture.
What this seems to mean for XML-Sitemaps is this: say I have 1000+ listings in a category; links are only generated in the sitemap for the first 100 (which matches the 10 visible page links times 10 listings each). Now, this problem is not entirely consistent, because I have seen generations that take more links into the sitemap (but still not all). What I am sure of is that the problem only occurs when there are more than 100 business listings within one category.
Is there a way around this problem? In my case I would really like XML-Sitemaps to be dead serious about prioritizing and following up every URL containing the string "limit=10", which is part of every URL whenever a category is divided into multiple pages. Is there a way to set up this kind of "reverse exclude" (include-only) rule? Or is there anything else you would recommend I do?
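As a fallback, I have been considering generating the pagination URLs myself and merging them into the sitemap, since the page count is easy to compute from the listing total. A minimal sketch of that idea (the `offset`/`limit` URL pattern here is a hypothetical stand-in for my site's real pagination scheme):

```python
# Sketch: enumerate every pagination page of a category directly,
# instead of relying on the crawler to click through the navigator.
# The URL pattern (offset + limit=10) is a hypothetical example.
from math import ceil

def pagination_urls(base, total_listings, per_page=10):
    """Return one URL per results page for a category."""
    pages = ceil(total_listings / per_page)
    return [f"{base}?offset={i * per_page}&limit={per_page}"
            for i in range(pages)]

def sitemap_entries(urls):
    """Wrap each URL in a sitemap <url><loc> entry."""
    return "\n".join(f"<url><loc>{u}</loc></url>" for u in urls)

# 1000 listings at 10 per page -> 100 pagination pages
urls = pagination_urls("http://example.com/category/42", 1000)
print(len(urls))
print(sitemap_entries(urls[:1]))
```

This would at least guarantee the crawler can reach every page of listings, even if it never follows the "next" link in the navigator.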