Hello, I'm having trouble getting the sitemap generator to crawl all the pages of a website I'm responsible for. The site and section in question are: [ External links are visible to forum administrators only ]

I ran the sitemap generator with the following settings:
Starting URL: [ External links are visible to forum administrators only ]
Sitemap URL: [ External links are visible to forum administrators only ]
Include Only URLs: black-book-values
Parse Only URLs: black-book-values
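
For what it's worth, here is a rough sketch (my own Python illustration, not the generator's actual code) of how I understand the two filter settings to behave; the only real value in it is the "black-book-values" substring from my settings:

# "Include Only" decides which URLs end up in the sitemap;
# "Parse Only" decides which pages get fetched and scanned for further links.
INCLUDE_ONLY = ["black-book-values"]
PARSE_ONLY = ["black-book-values"]

def should_include(url):
    # a URL is written to the sitemap only if it contains an "Include Only" pattern
    return any(pattern in url for pattern in INCLUDE_ONLY)

def should_parse(url):
    # a page is crawled for more links only if its URL contains a "Parse Only" pattern
    return any(pattern in url for pattern in PARSE_ONLY)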

When I ran this, it generated approximately 15,000 pages. The ultimate goal is to index the final page of the Black Book values; for example, to index the last page for a trade-in value, the crawl path would be:
1.  [ External links are visible to forum administrators only ]   (you need to use the static links at the bottom)
2.  [ External links are visible to forum administrators only ]
3.  [ External links are visible to forum administrators only ]
4.  [ External links are visible to forum administrators only ]
5.  [ External links are visible to forum administrators only ] (need to index this page)

The problem, however, is that the crawler doesn't get past step 4. I'm not sure whether the crawler dislikes the link "Or get the Trade-in Value of my 2009 Dodge Avenger with default selections", or what else the problem might be; in theory, that is the link to the very last page (URL #5).
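
In case it helps with the diagnosis, this is the kind of quick check I can run against the step 4 page to see which links and robots directives it actually exposes to a crawler (a rough Python sketch; the URL below is only a placeholder, since the real links are hidden above):

from html.parser import HTMLParser
from urllib.request import urlopen

PAGE = "http://example.com/step-4-page"  # placeholder for the real step 4 URL

class LinkAndMetaDumper(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            # print each outgoing link and whether it carries rel="nofollow"
            print("link:", attrs["href"], "rel:", attrs.get("rel", ""))
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            # print any page-level robots directive (e.g. noindex, nofollow)
            print("robots meta:", attrs.get("content", ""))

html = urlopen(PAGE).read().decode("utf-8", errors="replace")
LinkAndMetaDumper().feed(html)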

This is my first post, so let me know if you need any additional information. Thanks in advance for any assistance.
Re: Unable to crawl all pages of a certain section using sitemap generator
« Reply #1 on: September 27, 2010, 08:51:01 PM »
Hello,

The page linked in point #5 has the following tag in its HTML source:
<meta name="robots" content="noindex, nofollow" >

It tells search engines not to index the page or follow its links, so you should remove that tag.
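
Once the tag is removed, a quick way to confirm the page no longer blocks crawlers is to fetch it and look at the robots meta content. A minimal Python sketch, with a placeholder URL standing in for the real page from step 5:

import re
from urllib.request import urlopen

PAGE = "http://example.com/step-5-page"  # placeholder for the real step 5 URL

html = urlopen(PAGE).read().decode("utf-8", errors="replace")
# look for a <meta name="robots" ...> tag and report its content attribute
# (assumes the name attribute comes before content, as in the tag quoted above)
match = re.search(
    r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
    html,
    re.IGNORECASE,
)
if match:
    print("robots meta found:", match.group(1))  # e.g. "noindex, nofollow"
else:
    print("no robots meta tag found - the page can be indexed and followed")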