No, they are not in the sitemap, but I'm trying to understand the logic. I crawled BEFORE uploading robots.txt to my server and BEFORE entering anything in the 'Exclude URLs' area of 'Configuration', and I still got the same 1,788 pages crawled. It's true that the sitemap was generated the way I want (with those pages excluded), but my question is: how? (The only reason I even care is that I want to be sure I understand the script's logic, so I know I'm doing everything properly.)
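For reference, the kind of robots.txt I later uploaded looks something like this (the paths here are just placeholders, not my actual directories):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```

My understanding is that rules like these should only affect crawlers that read robots.txt, which is why I'm confused that the pages were already excluded before I uploaded it.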
Also, I cannot seem to find an explanation of ROR.XML anywhere. What is it, and how/why/under what circumstances should I use it?
Thanks so much