Welcome to Sitemap Generator Forum.
 

User-agent in robots.txt

Started by wouter, February 02, 2007, 11:02:19 AM


wouter

Hi,

I want to disallow my forum directory for the sitemap, because my forum is too big. So I have set the following in the robots.txt file:

User-agent: XML Sitemaps Generator 1.0 (https://www.xml-sitemaps.com/)
Disallow: forum/

And I have 'xs_robotstxt' => '1', in the config.

But it still indexes the forum directory, so I think the user-agent isn't right. What must I set?

Greets

XML-Sitemaps Support

Hello,

The sitemap generator looks for the "googlebot" (or "*") user-agent in robots.txt, since its main purpose is to create a Google sitemap. You can add "forum/" to the "Do not parse" and "Exclude URLs" options to achieve the result you need.
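For reference, a robots.txt section the generator would honor might look like the sketch below (using the "*" user-agent, per the reply above; note that this is only an illustration, and a "*" rule also applies to real search-engine crawlers, which is why the generator's own options may be the better fit here):

```
# Applies to all crawlers, including this sitemap generator,
# which honors "googlebot" or "*" sections of robots.txt.
User-agent: *
Disallow: /forum/
```

Note that Disallow paths are matched from the site root, so the standard form starts with a leading slash ("/forum/" rather than "forum/").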

wouter

Quote from: admin on February 02, 2007, 10:32:57 PM
Hello,

The sitemap generator looks for the "googlebot" (or "*") user-agent in robots.txt, since its main purpose is to create a Google sitemap. You can add "forum/" to the "Do not parse" and "Exclude URLs" options to achieve the result you need.

Do I have to do that in robots.txt?

XML-Sitemaps Support

"Do not parse" and "Exclude URLs" are defined in sitemap generator configuration (not robots.txt).
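In the config file that already holds 'xs_robotstxt', the exclusion options would be entries in the same array style. The key names below are hypothetical placeholders for illustration only; check your generator's configuration file for the exact identifiers used by your version:

```
// Hypothetical option names -- verify the real keys in your
// sitemap generator configuration before using these.
'xs_donotparse'  => 'forum/',
'xs_excludeurls' => 'forum/',
```

With entries like these, URLs under forum/ would be skipped without needing any robots.txt changes.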