• Welcome to Sitemap Generator Forum.
 

Help getting crawler to parse/add files in sub folders.

Started by knjmarketing, August 25, 2007, 04:55:00 PM


knjmarketing

Hi gang,

I've just successfully installed the PHP sitemap generator and did my first crawl, but there's a serious problem.

I set up my site with multiple sub folders (linked to sub-domains), one for each niche.

For example:

[ External links are visible to forum administrators only ]
[ External links are visible to forum administrators only ]
[ External links are visible to forum administrators only ]

When running the crawler, it only lists the 6 index files found in:
[ External links are visible to forum administrators only ]

It's not spidering:

[ External links are visible to forum administrators only ]
[ External links are visible to forum administrators only ]
[ External links are visible to forum administrators only ]

The sitemap.xml file is in the public_html folder.

How do I get the crawler to parse the 6 .php pages themselves so it reaches the other 50+ niche folders, and lists the files they contain in the [ External links are visible to forum administrators only ] file?

None of the files are .html; everything is a dynamically generated .php file. Does the script have a problem parsing .php files? Please assist.

XML-Sitemaps Support

Hello,

According to the Sitemaps protocol, a single sitemap may not include URLs from different (sub)domains. You will have to create a separate sitemap for each subdomain.
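
For illustration (using hypothetical subdomain names, not the poster's actual sites): each subdomain would get its own sitemap.xml listing only URLs on that same subdomain, e.g.:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap placed at http://niche1.example.com/sitemap.xml -->
<!-- It may only list URLs on niche1.example.com, per the Sitemaps protocol -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://niche1.example.com/index.php</loc>
  </url>
  <url>
    <loc>http://niche1.example.com/page2.php</loc>
  </url>
</urlset>
```

Each file also has to be hosted on (and submitted for) its own subdomain, since a sitemap is only considered valid for URLs at or below the location it is served from.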