XML Sitemaps Generator

Author Topic: Help getting crawler to parse/add files in sub folders.  (Read 18250 times)

knjmarketing

  • Registered Customer
  • Approved member
  • *
  • Posts: 1
Help getting crawler to parse/add files in sub folders.
« on: August 25, 2007, 03:55:00 PM »
Hi gang,

I've just successfully installed the PHP sitemap generator and did my first crawl, but there's a serious problem.

I set up my site with multiple subfolders (linked to sub-domains), one for each niche.

For example:

[external links are visible to admins only]
[external links are visible to admins only]
[external links are visible to admins only]

When running the crawler, it only lists the 6 index files found in:
[external links are visible to admins only]

It's not spidering:

[external links are visible to admins only]
[external links are visible to admins only]
[external links are visible to admins only]

The sitemap.xml file is in the public_html folder.

How do I get the crawler to parse the 6 .php pages themselves so it can reach the other 50+ niche folders and list their files in the [external links are visible to admins only] file?

None of the files are .html; everything is a dynamically generated .php file. Does the script have a problem parsing .php files? Please assist.

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10622
Re: Help getting crawler to parse/add files in sub folders.
« Reply #1 on: August 26, 2007, 02:48:06 PM »
Hello,

According to the sitemaps protocol, a single sitemap may not include URLs from different (sub)domains. You will have to create a separate sitemap for each subdomain.
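Concretely, each subdomain would get its own sitemap file, placed at that subdomain's document root and submitted separately. A minimal sketch of one such file (the domain and URL here are hypothetical, since the actual links in this thread are admin-only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Saved as sitemap.xml at the root of one subdomain,
     e.g. http://niche1.example.com/sitemap.xml -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://niche1.example.com/</loc>
    <lastmod>2007-08-26</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- ...one <url> entry per page on this subdomain only... -->
</urlset>
```

Every `<loc>` in a given sitemap must share the host (and protocol) of the sitemap's own location, which is why one sitemap.xml in public_html cannot cover URLs served from the sub-domains.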
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.


SMF 2.0.12 | SMF © 2014, Simple Machines