Blocking specific pages/directories from getting crawled
« on: July 25, 2007, 06:55:42 PM »
Hi, is there a way to block specific pages/directories from being crawled?
Re: Blocking specific pages/directories from getting crawled
« Reply #1 on: July 26, 2007, 01:24:58 AM »
Hello,

yes, you can block them using robots.txt, or define them in the "Exclude pages" option in the Sitemap generator configuration.
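
For the robots.txt route, a minimal sketch would look like the following, placed in the site root. The paths here are placeholders, not actual paths from this thread:

# applies to all crawlers
User-agent: *
# block an entire directory (hypothetical path)
Disallow: /private/
# block a single page (hypothetical path)
Disallow: /temp/page.html

A Disallow entry ending in a slash excludes the whole directory, while a full path excludes just that page.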