I have two sites that I use this on: one built with Microsoft FrontPage (plain HTML) and another scripted entirely in PHP. Both are hosted on GoDaddy shared hosting, and both run fine with the online GUI sitemap generator. I have both set up as cron jobs to run nightly. The HTML site generates its sitemap without any errors, but the PHP site spits out this error:
"/web/cgi-bin/php5: /usr/local/lib/libpng12.so.0: no version information available (required by /web/cgi-bin/php5)
Set-Cookie: PHPSESSID=qgkb77b0k2503bf7btepkhg7g0; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
<h4>An error occured: </h4> <script> top.location = 'index.php?op=config&errmsg=%3Cb%3EThere+was+an+error+while+retrieving+the+URL+specified%3A%3C%2Fb%3E+http%3A%2F%2Fwww.habitkickers.com%2F%3Cbr%3E%3Cb%3EHTTP+headers+follow%3A%3C%2Fb%3E%3Cbr%3Eserver%3A+squid%2F2.6.STABLE21%3Cbr+%2F%3Edate%3A+Mon%2C+12+Dec+2011+19%3A00%3A03+GMT%3Cbr+%2F%3Econtent-type%3A+text%2Fhtml%3Cbr+%2F%3Econtent-length%3A+1091%3Cbr+%2F%3Eexpires%3A+Mon%2C+12+Dec+2011+19%3A00%3A03+GMT%3Cbr+%2F%3Ex-squid-error%3A+ERR_ACCESS_DENIED+0%3Cbr+%2F%3Ex-cache%3A+MISS+from+p3nlhproxy003.shr.prod.phx3.secureserver.net%3Cbr+%2F%3Ex-cache-lookup%3A+NONE+from+p3nlhproxy003.shr.prod.phx3.secureserver.net%3A3128%3Cbr+%2F%3Econnection%3A+c' </script> "
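To make that readable, the errmsg parameter can be URL-decoded (a quick sketch; any URL decoder gives the same result). It shows the request to http://www.habitkickers.com/ is being answered by GoDaddy's outbound squid proxy with ERR_ACCESS_DENIED rather than by my site:

```python
# Decode the URL-encoded errmsg parameter copied from the error above.
from urllib.parse import unquote_plus

errmsg = (
    '%3Cb%3EThere+was+an+error+while+retrieving+the+URL+specified%3A%3C%2Fb%3E'
    '+http%3A%2F%2Fwww.habitkickers.com%2F%3Cbr%3E%3Cb%3EHTTP+headers+follow'
    '%3A%3C%2Fb%3E%3Cbr%3Eserver%3A+squid%2F2.6.STABLE21%3Cbr+%2F%3Edate%3A+'
    'Mon%2C+12+Dec+2011+19%3A00%3A03+GMT%3Cbr+%2F%3Econtent-type%3A+text%2F'
    'html%3Cbr+%2F%3Econtent-length%3A+1091%3Cbr+%2F%3Eexpires%3A+Mon%2C+12+'
    'Dec+2011+19%3A00%3A03+GMT%3Cbr+%2F%3Ex-squid-error%3A+ERR_ACCESS_DENIED'
    '+0%3Cbr+%2F%3Ex-cache%3A+MISS+from+p3nlhproxy003.shr.prod.phx3.'
    'secureserver.net%3Cbr+%2F%3Ex-cache-lookup%3A+NONE+from+p3nlhproxy003.'
    'shr.prod.phx3.secureserver.net%3A3128%3Cbr+%2F%3Econnection%3A+c'
)

# unquote_plus turns '+' into spaces and decodes the %XX escapes.
decoded = unquote_plus(errmsg)
print(decoded)
```

So the crawler's fetch of my own site is being blocked upstream, not failing inside the script itself.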
I have tried every solution provided on this and other forums with no luck. This includes changing the IP address of the crawler and changing the cron job command, and I have checked all my directory and file permissions; they are exactly what the installation PDF says. What can I do to fix this issue?
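For reference, the cron-command changes I tried were along these lines (the paths and script name here are placeholders, not my exact account paths):

```shell
# Illustrative crontab entries -- paths/script name are placeholders.
# Variant 1: GoDaddy's CGI php binary (this is the one printing the
# libpng "no version information available" warning above).
0 2 * * * /web/cgi-bin/php5 "$HOME/html/generator/runcrawl.php"

# Variant 2: the CLI php binary, in case the CGI wrapper was at fault.
0 2 * * * /usr/local/bin/php "$HOME/html/generator/runcrawl.php"
```

Both variants produce the same squid ERR_ACCESS_DENIED output for the PHP site.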
My web URL is [external links are visible to admins only]