XML Sitemaps Generator

Author Topic: Link depth not reached exclude urls seem to be working backwards  (Read 7424 times)

crxvfr

  • Registered Customer
  • Approved member
  • Posts: 5
I have URLs with underscores that I would like to exclude. Let's call one url_page1.asp?blah-blah. There is also a url_page2.asp?blah-blah that I don't want to exclude. If I put url_page1 or url_page2 in the exclude URL box, it seems to exclude all url_page* pages, and when I remove the entry altogether, it crawls even fewer pages. ??? It also won't crawl to the link depth I've specified, but if I use one of the pages it's missing as the starting point, it crawls all the links past that point like it should. I'm so confused I'm having trouble asking questions clearly.
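For what it's worth, the symptoms above are easy to reason about if the exclude box matches patterns as plain substrings. This is only a sketch under that assumption; the generator's actual matching rules aren't documented in this thread, and the `is_excluded` helper below is hypothetical:

```python
# Hypothetical sketch: an "exclude URLs" filter that treats each pattern
# as a plain substring of the URL. This is an assumption for illustration,
# not the generator's confirmed behavior.
def is_excluded(url: str, patterns: list[str]) -> bool:
    """Return True if any exclude pattern occurs anywhere in the URL."""
    return any(p in url for p in patterns)

urls = [
    "url_page1.asp?blah-blah",
    "url_page2.asp?blah-blah",
]

# Under pure substring matching, "url_page1" excludes only the first URL:
kept = [u for u in urls if not is_excluded(u, ["url_page1"])]
print(kept)  # only url_page2.asp?blah-blah survives

# But a broader pattern such as "url_page" excludes both, which would
# look like "all url_page* pages" disappearing from the crawl:
kept_broad = [u for u in urls if not is_excluded(u, ["url_page"])]
print(kept_broad)
```

If the tool does substring matching like this, the fix would be to make the exclude pattern as specific as possible (e.g. include the `.asp` part) so it can't also match the pages you want kept.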

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • Posts: 10621
Re: Link depth not reached exclude urls seem to be working backwards
« Reply #1 on: May 02, 2012, 10:40:35 AM »
Hello,

Please let me know your generator URL/login and example URLs in a private message so I can check this.
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.
