User-agent in robots.txt
« on: February 02, 2007, 11:02:19 AM »
Hi,

I want to disallow my forum directory for the sitemap, because my forum is too big. So I have set the following in the robots.txt file:

User-agent: XML Sitemaps Generator 1.0 (https://www.xml-sitemaps.com/)
Disallow: forum/

And I have 'xs_robotstxt' => '1' in the config.

But it still indexes the forum directory, so I think the user-agent isn't right. What should I set?

Greets
Re: User-agent in robots.txt
« Reply #1 on: February 02, 2007, 10:32:57 PM »
Hello,

the sitemap generator looks for the "googlebot" (or "*") user-agent in robots.txt, since its main purpose is to create a Google sitemap. You can add "forum/" to the "Do not parse" and "Exclude URLs" options to achieve the result you need.
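If you also want the rule to work in robots.txt itself, a minimal robots.txt that the generator would honor (given that it matches "googlebot" or "*", as described above) could look like this. Note that Disallow paths should start with a leading slash:

```
# Matched by Googlebot and, via *, by the sitemap generator as well
User-agent: *
Disallow: /forum/
```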
Re: User-agent in robots.txt
« Reply #2 on: February 09, 2007, 03:13:56 PM »
Hello,

the sitemap generator looks for the "googlebot" (or "*") user-agent in robots.txt, since its main purpose is to create a Google sitemap. You can add "forum/" to the "Do not parse" and "Exclude URLs" options to achieve the result you need.

Do I have to do that in robots.txt?
Re: User-agent in robots.txt
« Reply #3 on: February 09, 2007, 11:44:38 PM »
"Do not parse" and "Exclude URLs" are defined in sitemap generator configuration (not robots.txt).