XML Sitemaps Generator

Author Topic: Better handling of robots.txt  (Read 13909 times)


  • Registered Customer
  • Approved member
  • *
  • Posts: 2
Better handling of robots.txt
« on: December 26, 2008, 06:39:30 PM »
It is great that the robots.txt file is used in sitemap generation, but I would love to see it behave more like Googlebot here. According to the [external links are visible to admins only], Googlebot supports basic pattern matching in robots.txt as well as the "Allow:" directive, both of which I find very useful. I would love to see these supported in the script, since I use both in my robots.txt file.

I realize that these extensions are not part of the original robots.txt standard, but many people use them anyway.
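To illustrate the behavior being requested, here is a small sketch of Googlebot-style matching: `*` as a wildcard, a trailing `$` as an end anchor, and the longest matching rule winning (with ties going to "Allow"). This is not the generator's actual code, and the example paths are made up:

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern to a regex.
    '*' matches any character sequence; a trailing '$' anchors the end."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as a wildcard.
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    """rules: list of ('allow' | 'disallow', pattern) pairs.
    Googlebot-style precedence: the longest matching pattern wins;
    on a tie, 'allow' wins."""
    best_len = -1
    allowed = True  # no matching rule -> crawling is allowed
    for kind, pattern in rules:
        if not pattern:
            continue
        if pattern_to_regex(pattern).match(path):
            if len(pattern) > best_len or (len(pattern) == best_len and kind == "allow"):
                best_len = len(pattern)
                allowed = (kind == "allow")
    return allowed

# Hypothetical rules, as they might appear in a robots.txt:
#   Disallow: /private*
#   Allow: /private/public$
rules = [("disallow", "/private*"), ("allow", "/private/public$")]
```

With these rules, `/private/secret` is blocked by the wildcard, while `/private/public` is allowed because the "Allow" pattern is the longer match.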

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10599
Re: Better handling of robots.txt
« Reply #1 on: December 27, 2008, 02:51:27 PM »

The sitemap generator supports only standard robots.txt directives. However, you can disable robots.txt processing manually via the 'xs_robotstxt' setting in the config.inc.php file and use the "Exclude URLs" option instead.
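For reference, disabling it might look something like this in config.inc.php. This is a sketch only: the 'xs_robotstxt' setting name comes from the reply above, but the exact assignment syntax and value may differ in your version of the script:

```
// config.inc.php (sketch): turn off robots.txt processing so the
// "Exclude URLs" option governs exclusions instead.
// Only the 'xs_robotstxt' name is taken from the reply; the
// assignment form shown here is an assumption.
$xs_robotstxt = '0';
```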
Oleg Ignatiuk

For maximum exposure and traffic for your web site check out our additional SEO Services.

