Hi. If I start the sitemap generator at an HTML page, it follows all the links on that page, which brings it into the dynamic pages (even though I've included the meta tag <META NAME="robots" content="index,nofollow">).
If I start the sitemap generator at a directory containing the HTML pages, it lists the directory itself, but not the HTML pages inside it.
For example, the output contains only:

  <url>
    <loc>[ External links are visible to forum administrators only ]</loc>
    <priority>0.5</priority>
    <changefreq>weekly</changefreq>
  </url>
Basically, I need a way to crawl several hundred independent HTML pages that are not linked to each other. Is there a way to do this?
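Since the pages aren't linked, one workaround is to skip crawling entirely and build the sitemap directly from a directory listing. Below is a minimal sketch in Python; the base URL and directory name are hypothetical placeholders, and the <priority>/<changefreq> values simply mirror the example output above:

```python
import os
from xml.sax.saxutils import escape

def build_sitemap(html_dir, base_url):
    """Emit a sitemap <url> entry for every .html file in html_dir.

    html_dir -- local directory holding the standalone HTML pages (assumed)
    base_url -- public URL prefix those files are served under (assumed)
    """
    entries = []
    for name in sorted(os.listdir(html_dir)):
        if not name.endswith(".html"):
            continue  # skip non-HTML files in the directory
        loc = escape(base_url + name)  # escape &, <, > for valid XML
        entries.append(
            "  <url>\n"
            f"    <loc>{loc}</loc>\n"
            "    <priority>0.5</priority>\n"
            "    <changefreq>weekly</changefreq>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

# Hypothetical usage -- adjust both arguments for your site:
# xml_text = build_sitemap("pages", "http://www.example.com/pages/")
# open("sitemap.xml", "w").write(xml_text)
```

An alternative, if you'd rather stick with a crawler, is to generate one temporary index page that links to all the standalone files and point the sitemap generator at that.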