Help getting crawler to parse/add files in sub folders.
« on: August 25, 2007, 04:55:00 PM »
Hi gang,

I've just successfully installed the PHP sitemap generator and did my first crawl, but there's a serious problem.

I set up my site with multiple subfolders (linked to subdomains), one for each niche.

For example:

http://Teddy-Bear.Domain.com
http://Home-Mortgage.Domain.com
http://horse-racing.Domain.com

When running the crawler, it only lists the 6 index files found in:
http://www.Domain.com

It's not spidering:

http://www.domain.com/teddy-bear
http://www.domain.com/horse-racing
http://www.domain.com/home-mortgage

The sitemap.xml file is in the public_html folder.

How do I get the crawler to parse the 6 .php pages themselves so it reaches the other 50+ niche folders and lists their files in the http://www.domain.com/sitemap.xml file?

None of the files are .html; everything is a dynamically generated .php file. Does the script have a problem parsing .php files? Please assist.
Re: Help getting crawler to parse/add files in sub folders.
« Reply #1 on: August 26, 2007, 03:48:06 PM »
Hello,

according to the Sitemaps protocol, a single sitemap may not include URLs from different (sub)domains — all URLs in one file must belong to the same host. You will have to create a separate sitemap for each subdomain.
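As a rough illustration of the per-host rule, here is a minimal sketch (in Python, not the generator's own PHP) that builds one sitemap file per subdomain. The subdomain names come from your post; the page list and output filenames are hypothetical — each generated file would need to be served from its own subdomain's web root.

```python
# Sketch: one sitemap per subdomain, since the Sitemaps protocol
# requires all URLs in a file to share a single host.
from xml.sax.saxutils import escape

def build_sitemap(base_url, paths):
    """Return sitemap XML listing each path under a single host."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(base_url.rstrip("/") + p)
        for p in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )

# Hypothetical: one file per niche subdomain, uploaded to that host's root.
for host in ("teddy-bear", "home-mortgage", "horse-racing"):
    xml = build_sitemap("http://%s.domain.com" % host, ["/index.php"])
    # open("sitemap-%s.xml" % host, "w").write(xml)
```

Each resulting file is a valid standalone sitemap for its own subdomain; a sitemap placed at http://www.domain.com/sitemap.xml can only cover URLs on www.domain.com itself.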
Oleg Ignatiuk
www.xml-sitemaps.com