Even after I added code to exclude URLs, the generator says:
Pages scanned: 12100 (301,823.7 Kb)
Pages left: 88474
But the maximum number of pages to index should be 15,000.
So my first question is: does the generator follow configuration changes made after crawling has already started?
And does the generator check robots.txt and apply any exclusions itself?
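For context on the robots.txt question: many crawlers do honor robots.txt on their own, independently of any user-configured exclusion list. Whether this particular generator does is exactly what I'm asking, but here is a minimal sketch of how such checks typically work, using Python's standard `urllib.robotparser` (an illustration only, not the generator's actual code):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly from a list of lines;
# a real crawler would fetch the file with set_url() + read().
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A crawler that honors robots.txt would skip disallowed URLs:
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True
```

If the generator performs a check like this itself, the exclusion list in the configuration would only need to cover URLs not already blocked by robots.txt.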