XML Sitemaps Generator

Author Topic: Blocking specific pages/directories from getting crawled  (Read 18408 times)

yacosta

  • Registered Customer
  • Approved member
  • *
  • Posts: 1
Blocking specific pages/directories from getting crawled
« on: July 25, 2007, 05:55:42 PM »
Hi, is there a way to block specific pages/directories from being crawled?

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Blocking specific pages/directories from getting crawled
« Reply #1 on: July 26, 2007, 12:24:58 AM »
Hello,

Yes, you can block them using robots.txt, or you can list them in the "Exclude pages" option in the Sitemap Generator configuration.
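
For example, a minimal robots.txt sketch along these lines keeps compliant crawlers (including the generator, which honors robots.txt) out of a directory and a single page. The file sits at the root of your site (e.g. www.example.com/robots.txt); the /private/ directory and /old-page.html paths are placeholders for your own URLs:

    User-agent: *
    Disallow: /private/
    Disallow: /old-page.html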
Oleg Ignatiuk
www.xml-sitemaps.com

