Crawling problems
« on: June 24, 2006, 04:45:52 PM »
Hi

I have worked with the standalone sitemap generator before with no problems. However, today I tried to generate a sitemap again and it doesn't seem to be crawling my pages. As soon as I press the button to crawl, it returns to the configuration panel where I need to enter the starting URL. I see the message "please wait, generation in progress", then a white screen appears, and then it returns to the configuration screen.
Any ideas what I might be doing wrong?

[ External links are visible to forum administrators only ]

Regards, Rik
« Last Edit: June 24, 2006, 05:20:58 PM by crazyhorse2221 »
Re: Crawling problems
« Reply #1 on: June 24, 2006, 10:36:23 PM »
Hello Rik,

the problem seems to be related to your host's configuration. Most likely, your server doesn't allow local network connections for PHP scripts, which prevents the sitemap generator from crawling your site.
This is usually caused by a firewall installed on the server. You may want to contact your hosting support about this.
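The crawler simply opens an ordinary TCP/HTTP connection back to your own site, so this failure mode can be reproduced outside the generator. A minimal sketch of the check, written here in Python rather than the generator's PHP (the hostname is a placeholder for your own domain):

```python
import socket

def can_connect(host: str, port: int = 80, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    This mimics the first step of the crawler's HTTP fetch. If the
    server's firewall blocks local connections, this fails for your
    own domain while still succeeding for external sites.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder hostname; run this on the web server itself:
# print(can_connect("example.com"))
```

If this returns False for your own domain when run from the server, the firewall explanation above fits.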
Re: Crawling problems
« Reply #2 on: June 25, 2006, 07:50:39 AM »
Thanks. It worked before, though, on the same host. I will contact them, although I'm a bit sceptical that this is the problem!
Re: Crawling problems
« Reply #3 on: June 25, 2006, 09:01:57 AM »
Hi

I just contacted the server administrator and this is the reply I got:

Quote
Hello,

I am sorry, but the server does not allow connections unless they are on a normally open port. We have all other ports closed. A normally open port would be 80 or 110, i.e. a port on which one of the many services we run is listening.

Regards,
Victor


Re: Crawling problems
« Reply #4 on: June 26, 2006, 01:17:36 PM »
Hello,

since it worked fine before, there must have been some change applied to the server configuration.
Quote
I just contacted the server administrator and this is the reply I got:
Sitemap Generator only uses port 80 (HTTP) for its network connections.
You can try entering a *different* site URL (like www.xml-sitemaps.com) in the "Initial URL" setting and start crawling. If that works correctly, the server configuration allows connections to *external* sites via port 80 but denies the same for *local* connections.
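The external-vs-local comparison described above can also be scripted. A hedged sketch (the local hostname is a placeholder, and this uses Python rather than the generator's PHP):

```python
import urllib.error
import urllib.request

def fetch_status(url: str, timeout: float = 10.0):
    """Return the HTTP status code for url, or None if the
    connection itself fails (firewall, DNS failure, timeout)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code   # the server answered, even if with an error page
    except (urllib.error.URLError, OSError):
        return None       # no connection was established at all

# Run on the web server itself and compare the two results:
external = fetch_status("http://www.xml-sitemaps.com/")
# local = fetch_status("http://your-own-domain.example/")  # placeholder
# If `external` is a status code but `local` is None, port 80 works
# for outbound external requests while local connections are denied,
# matching the diagnosis above.
```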
Re: Crawling problems
« Reply #5 on: June 26, 2006, 02:55:52 PM »
I have changed the initial URL to www.xml-sitemaps.com. It starts crawling, however the screen freezes: I see the message "sitemap generation in progress" and a white screen. So it must have something to do with the server configuration. I worked before with an earlier release of XML Sitemaps, with which I didn't have any problems on the same server.
Re: Crawling problems
« Reply #6 on: June 26, 2006, 04:42:17 PM »
Hello,

yes, as I mentioned above, something must have *changed* in your server configuration, since it was working before and now it doesn't.
Re: Crawling problems
« Reply #7 on: June 27, 2006, 06:42:38 PM »
Hi

I got a reply back from my host.

Quote
Greetings,

You have a 'deny from' line in your .htaccess file. This may very well be blocking the spider engine. The ports that your sitemap generator utilizes are open and working properly.

Regards,
Pat

I got rid of the .htaccess file. Now it does seem to be able to crawl, but it doesn't crawl the pages.
Could you do me a favor and have a look at my initial URL? I have sent you a PM. Much appreciated.
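For reference, the kind of rule the host describes might look like the fragment below. This is a hypothetical example, since the actual file's contents were never shown in the thread; commenting out just the offending line is usually safer than deleting the whole .htaccess file, which often carries other settings too:

```apache
# Hypothetical .htaccess fragment (Apache 2.2-era mod_access syntax).
Order allow,deny
Allow from all
# A rule like the one below can block the crawler, because the crawler
# connects from the web server's own IP address (192.0.2.10 here is a
# placeholder from the documentation address range):
# Deny from 192.0.2.10
```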