My websites all run on the same application engine (IIS), deployed from the same directory. The sites have different domain names but generate completely different content from the database, with slightly different layouts. I have a few questions and doubts about generating a sitemap that best suits this setup, and I was hoping to get some advice here.

I'm really excited about starting to use sitemaps, and I'm hoping it will make an SEO difference.

I've been experimenting with creating sitemaps in a few ways: using a crawler, using Google Sitemap Generator, and writing my own code that generates a sitemap from the database every time it is requested (with URL rewriting on the sitemap XML so that there won't be any issues when requesting it).

I'd like to try Google Sitemap Generator, as it has some really interesting features and would hopefully mean less work and less maintenance in the long run.

My concerns are:

1. I wasn't able to put two domain names in the site configuration under Host name. That means all the content from one website will be tracked under the other, causing a conflict between the sites in the sitemap.
2. Are there any issues with having the two sites in one sitemap? Is that more likely to trigger duplicate-content detection between the sites if there are slight similarities?
3. I wasn't entirely sure about query strings. If a query string shows different content on the page, then should I allow query-string URLs into the sitemap? (paging/filters/searches)
4. We have about 300,000 visitors a month. Should I be concerned about any performance impact on the server?
5. There's new content on our website all the time. Will that be a problem if I turn automatic sitemap submission on? How often will it submit then?
6. Every time Google Sitemap Generator reaches the maximum sitemap size, will it add the new XML file to robots.txt?
7. I'm planning to change our URL rewrite rule to lowercase and 301-redirect any uppercase URL to its lowercase form. What would happen with my current sitemap? The generator will start tracking all the new URLs, so should I just reduce the maximum age of a URL to a few weeks so that the old URLs are slowly removed from our sitemap?
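For reference, the lowercase redirect I have in mind would be something like this sketch for the IIS URL Rewrite module (the rule name is arbitrary; `{ToLower:}` is the module's built-in function):

```xml
<!-- web.config fragment: 301-redirect any URL containing an uppercase letter -->
<rewrite>
  <rules>
    <rule name="LowercaseRedirect" stopProcessing="true">
      <!-- match only paths that contain at least one uppercase letter -->
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```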

Looking forward to some response,

Thank you kindly in advance,
Regards, Aaron.

1,2. According to the sitemap protocol, a sitemap may only include URLs from a single domain, so you will have to create two sitemaps.

3. Yes, if a query string changes the page content, you should allow those URLs to be crawled.

4. Since the sitemap is only created from time to time, it will not affect general server performance.

5. You can set up a scheduled task to create the sitemap automatically, say, weekly or daily (depending on how actively the content changes).
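In case it helps, on Windows such a task can be registered with `schtasks`; the script path below is just a placeholder for whatever regenerates your sitemap:

```
rem Rebuild the sitemap every day at 03:00 (path is a placeholder)
schtasks /Create /SC DAILY /ST 03:00 /TN "RebuildSitemap" /TR "C:\scripts\rebuild_sitemap.cmd"
```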

6. It will create multiple sitemap files if the limit is exceeded. Only a single entry is required in robots.txt, though (sitemap.xml) - the main index file will contain pointers to all the other sitemap files.
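For illustration (the URLs are placeholders): robots.txt needs only the one Sitemap line, and the index file it points at lists the individual sitemap files, roughly like this:

```
# robots.txt
Sitemap: http://www.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap2.xml</loc></sitemap>
</sitemapindex>
```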

7. Yes, the old URLs will be re-indexed.

Hi Oleg,

Thank you for your reply!

So is there a way for me to configure Google Sitemap Generator to distinguish between my domains, so that it creates a different sitemap for each domain?

Otherwise I'm guessing my only options at the moment are:
1. Use the sitemap that Google Sitemap Generator created for me, write a script that periodically separates it into the correct portals, and publish the new sitemaps produced from that split.
2. Go back to the original idea of generating my own sitemaps directly from the database.
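A minimal sketch of option 2, assuming a hypothetical `pages` table with `path` and `last_modified` columns (the table, column, and URL names are made up for illustration):

```python
import sqlite3
from xml.sax.saxutils import escape

def build_sitemap(conn, base_url):
    """Build a sitemap XML string from a (hypothetical) pages table."""
    rows = conn.execute("SELECT path, last_modified FROM pages").fetchall()
    entries = []
    for path, last_modified in rows:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(base_url + path)}</loc>\n"
            f"    <lastmod>{last_modified}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

# Demo with an in-memory database standing in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (path TEXT, last_modified TEXT)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?)",
    [("/products/1", "2009-10-01"), ("/about", "2009-09-15")],
)
sitemap_xml = build_sitemap(conn, "http://www.example.com")
print(sitemap_xml)
```

Since each domain calls `build_sitemap` with its own `base_url`, this would naturally give one sitemap per site.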

Regards, Aaron.
So is there a way for me to configure Google Sitemap Generator to distinguish between my domains, so that it creates a different sitemap for each domain?
Yes, you will have to install two separate instances of the generator for that.