My robots.txt file is designed to exclude a variety of page types from being crawled by various bots. Is there a way to import my robots.txt into the XML sitemap generator so that it excludes the same types of pages? Specifically, I'd like to exclude URLs containing the following paths:
/category/
/feed/
/pages/
and any pages with "embed" in the URL.
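For reference, here is a quick way to sanity-check those rules locally with Python's standard-library robots.txt parser. The robots.txt content below is a hypothetical sketch mirroring the exclusions above; note that the `/*embed` wildcard form is a Google-style extension that the prefix-only stdlib parser does not understand, so it is listed for completeness but not checked here.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the exclusions listed above.
# The wildcard rule for "embed" URLs is a Google-style extension;
# Python's stdlib parser only does prefix matching, so that last
# rule is shown for completeness but not exercised below.
ROBOTS_TXT = """\
User-agent: *
Disallow: /category/
Disallow: /feed/
Disallow: /pages/
Disallow: /*embed
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Prefix rules: these paths should be reported as blocked.
for path in ("/category/news/", "/feed/", "/pages/about"):
    print(path, "allowed:", parser.can_fetch("*", path))

# A path matching no Disallow rule stays allowed.
print("/about", "allowed:", parser.can_fetch("*", "/about"))
```

This only verifies how a standards-following crawler would read the file; whether the sitemap generator honors the same rules is exactly the question for support below.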

Thanks!
Hello,

The sitemap generator should detect robots.txt directives automatically. Please send me your generator URL/login in a private message so I can check this.
I emailed you a private message with my URL/login.

Thanks!
I have the same issue... my robots.txt is not detected properly.
All the disallowed routes are still being parsed.
How do I properly set up robots.txt for the sitemap generator?
Where does it need to be available from?
Hello,

please send me your generator URL/login and example URLs in a private message so I can check this.