robots.txt - error
« on: July 14, 2015, 08:46:59 AM »
I had version 6 installed. My robots.txt contained the following lines:

User-agent: *
Disallow: /fotos2/
...

/fotos2 is my old gallery folder, /fotos is my new one.

When sitemap.xml, ror.xml and sitemap_images.xml are generated, the /fotos folder (and of course /fotos2) and all of its subdirectories are not included! I checked my robots.txt with the robots.txt tester in Google Webmaster Tools and all tests were fine. When I delete the Disallow: /fotos2/ line, all folders and subdirectories of /fotos/ and /fotos2/ are included!
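For reference, here is a minimal check (a sketch using Python's standard urllib.robotparser, not the generator's own parser; example.com is just a placeholder domain) of how that Disallow rule should be interpreted: only URLs under /fotos2/ are blocked, while /fotos/ stays allowed.

from urllib.robotparser import RobotFileParser

# Parse the same two lines from my robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /fotos2/",
])

# /fotos/ should remain crawlable; only /fotos2/ should be blocked
print(rp.can_fetch("*", "http://example.com/fotos/album1/"))   # expected: True
print(rp.can_fetch("*", "http://example.com/fotos2/album1/"))  # expected: False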

Then I updated from version 6 to 7 (the newest version) and I still have the same problem.

When I put /fotos2 in the excluded URLs setting instead, and do not have the Disallow: /fotos2/ line in robots.txt, all subdirectories of /fotos2 are still included.

How can I handle this?
Re: robots.txt - error
« Reply #1 on: July 15, 2015, 07:01:19 AM »
Hello,

please send me your generator URL/login in a private message so I can check this.