Changed site to ssl and now the crawler doesn't work
« on: November 30, 2016, 09:49:51 PM »
I recently changed my site over to SSL/HTTPS and I'm redirecting anything on http to the https version.  The site works fine, but I can no longer crawl with xml-sitemaps to generate an updated sitemap file.
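A quick way to confirm the redirect itself is healthy (a sketch; `example.com` is a placeholder for the real domain) is to request the http URL headers from a shell:

```shell
# Request only the headers of the http URL; a working redirect setup
# returns a 301/302 with a Location header pointing at the https version.
curl -sI http://example.com/ | head -n 5
```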

In the main configuration I have updated the following and kept the rest of my settings the same:

starting url : [ External links are visible to forum administrators only ]
your sitemap url : [ External links are visible to forum administrators only ]

When I go to the crawling page it tells me no sitemap was found (the old one with the http links is still there, though), and when I 'run' it, it redirects to the configuration tab and shows:

An error occurred
There was an error while retrieving the URL specified: [ External links are visible to forum administrators only ]
HTTP Code:

HTTP headers:
x_csize: 0

HTTP output:


What is the problem here and how can I fix it?
Re: Changed site to ssl and now the crawler doesn't work
« Reply #2 on: December 01, 2016, 06:59:34 AM »
Tried this already after searching the forum for ideas.  Nothing changed - same error and everything else.
Re: Changed site to ssl and now the crawler doesn't work
« Reply #3 on: December 01, 2016, 07:51:42 PM »
There is probably a configuration problem - it looks like your server doesn't allow local network connections on port 443 (https), so the sitemap generator is not able to crawl the site. This is usually caused by a firewall installed on the host - could you please contact your hosting support about this?
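One way to check for this (a sketch, assuming shell access on the web server; substitute the real domain for `example.com`) is to request the site from the server itself:

```shell
# Run this on the web server itself, not from your workstation.
# If the site loads fine from outside but this hangs or fails, the
# host's firewall is blocking local (server-to-itself) connections
# on port 443, which is exactly what stops the crawler.
curl -sS -o /dev/null -w "HTTP %{http_code}\n" https://example.com/
```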
Re: Changed site to ssl and now the crawler doesn't work
« Reply #4 on: December 01, 2016, 09:28:48 PM »
I'm looking at my firewall right now.  When you say a local connection, what exactly do you mean?  This is behind NAT, so I have internal IPs which are mapped to external ones as far as the public is concerned.  Not sure if this helps, but I would think one of the following rules would allow this, right?

"internal allow" - source = internal:any - destination = internal:any - for TCP
"web https" - source = any:any - destination = my public ip range:443 - for TCP
"outbound https" - source = internal:any - destination = any:443 - for TCP

I guess I'm confused about what you want me to look for in the firewall and what xml-sitemaps needs in order to work.  The sites are working perfectly fine, so I don't see why they are while this won't.  Sorry, just not understanding what it needs to operate in this sense.
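One common gotcha with a NAT setup like the rules above (a sketch, assuming the generator runs on the web server itself and `example.com` stands in for the real domain): from inside the network the domain resolves to the public IP, and unless the firewall supports hairpin/loopback NAT, the server cannot reach its own public address on 443. A common workaround is to point the domain at the loopback address in the server's hosts file:

```shell
# On the web server: check which IP the domain resolves to locally.
# If this shows the public IP, the crawler's requests go out through
# the firewall and need hairpin NAT to come back in.
getent hosts example.com

# Hypothetical workaround: add a line like the following to /etc/hosts
# (as root) so the crawler's requests never leave the machine:
#   127.0.0.1   example.com

# Then verify the request path works locally (-k skips certificate
# name checks, which may not match when connecting via loopback).
curl -sSk -o /dev/null -w "HTTP %{http_code}\n" https://example.com/
```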
Re: Changed site to ssl and now the crawler doesn't work
« Reply #5 on: December 01, 2016, 11:49:03 PM »
Oleg,

I went ahead and installed the generator on another domain on the same server / same setup - it worked fine.  So I deleted all the generator/sitemap files from the domain I was having problems with and reinstalled.  It works fine now.  I have no idea what the problem could have been, but as of now it seems to work  ;D

I wish I could tell you what the problem was, but I have no idea.  I just reinstalled and restored my normal settings.  Thanks for the help and have a great week!
Re: Changed site to ssl and now the crawler doesn't work
« Reply #7 on: December 02, 2016, 07:12:00 AM »
Same here... I wish I could tell you what is different, but I didn't change anything on the server, and I'm pretty sure the sitemap settings are the same as well.

Anyways - thanks again and have a great week!