I have URLs with underscores that I would like to exclude. Let's call one url_page1.asp?blah-blah. There is also a url_page2.asp?blah-blah that I don't want to exclude. If I put url_page1 or url_page2 in the exclude URLs box, it seems to exclude all url_page*'s, and when I remove the filter altogether, it crawls even fewer pages. ???

It also won't crawl to the link depth I've specified, but if I use one of the pages it's missing as the starting point, it crawls all the links past that point like it should. I'm so confused I'm having trouble asking questions clearly.
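For what it's worth, here is a minimal sketch of how substring-based exclusion filters typically behave (an assumption — the generator's actual matching rules may differ). Under plain substring matching, a filter of url_page1 should only catch url_page1.asp, while a shorter filter like url_page would catch both pages:

```python
# Hypothetical substring-based exclusion, sketching common crawler
# filter behavior; not the generator's actual implementation.
def is_excluded(url, filters):
    """Return True if any filter string appears anywhere in the URL."""
    return any(f in url for f in filters)

urls = [
    "http://example.com/url_page1.asp?blah-blah",
    "http://example.com/url_page2.asp?blah-blah",
]

# Too broad: "url_page" is a substring of both URLs, so both are excluded.
kept_broad = [u for u in urls if not is_excluded(u, ["url_page"])]

# Narrower: "url_page1" only matches the first URL, so page 2 survives.
kept_narrow = [u for u in urls if not is_excluded(u, ["url_page1"])]
```

If the behavior you see doesn't match this (i.e., url_page1 also excludes url_page2), the filter may be matching differently than plain substrings, which is worth raising with support as you've done here.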
Re: Link depth not reached; exclude URLs seem to be working backwards
« Reply #1 on: May 02, 2012, 11:40:35 AM »

Please let me know your generator URL/login and some example URLs in a private message so I can check this.