Error from cron job at GoDaddy
« on: July 14, 2011, 09:14:57 AM »
Hello,

When using a cron job to generate the sitemap, I get this error:

<h4>An error occured: </h4>
<script>
top.location = 'index.php?op=config&errmsg=%3Cb%3EThere+was+an+error+while+retrieving+the+URL+specified%3A%3C%2Fb%3E+http%3A%2F%2Fwww.thearticletips.com%2F%3Cbr%3E%3Cb%3EHTTP+headers+follow%3A%3C%2Fb%3E%3Cbr%3Eserver%3A+squid%2F2.6.STABLE6%3Cbr+%2F%3Edate%3A+Thu%2C+14+Jul+2011+07%3A40%3A02+GMT%3Cbr+%2F%3Econtent-type%3A+text%2Fhtml%3Cbr+%2F%3Econtent-length%3A+1094%3Cbr+%2F%3Eexpires%3A+Thu%2C+14+Jul+2011+07%3A40%3A02+GMT%3Cbr+%2F%3Ex-squid-error%3A+ERR_ACCESS_DENIED+0%3Cbr+%2F%3Ex-cache%3A+MISS+from+p3nlhproxy003.shr.prod.phx3.secureserver.net%3Cbr+%2F%3Ex-cache-lookup%3A+NONE+from+p3nlhproxy003.shr.prod.phx3.secureserver.net%3A3128%3Cbr+%2F%3Econnection%3A+'
</script>
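URL-decoded (and with the HTML markup stripped), that message reads:

There was an error while retrieving the URL specified: http://www.thearticletips.com/
HTTP headers follow:
server: squid/2.6.STABLE6
date: Thu, 14 Jul 2011 07:40:02 GMT
content-type: text/html
content-length: 1094
expires: Thu, 14 Jul 2011 07:40:02 GMT
x-squid-error: ERR_ACCESS_DENIED 0
x-cache: MISS from p3nlhproxy003.shr.prod.phx3.secureserver.net
x-cache-lookup: NONE from p3nlhproxy003.shr.prod.phx3.secureserver.net:3128
connection: (the message is cut off here)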

My current cron command at GoDaddy is:
/web/cgi-bin/php5 $HOME/html/generator/runcrawl.php

where $HOME is the GoDaddy variable for my home directory.
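In case it helps with debugging, here is a sketch of a cron entry that also captures the script's output (this assumes GoDaddy accepts standard five-field crontab entries; the schedule and the cron.log path are only placeholders):

# Run the crawler daily at 02:00 and append its output to a log in the writable data directory
0 2 * * * /web/cgi-bin/php5 $HOME/html/generator/runcrawl.php >> $HOME/html/generator/data/cron.log 2>&1

If the log contains the same squid "access denied" page as above, that would confirm the request from cron is being blocked before it ever reaches the site.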

Configured permissions:
sitemap.xml and ror.xml:   666
data directory:   777
The IP address is now set in XML-Sitemaps' [advanced settings].
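For reference, a minimal sketch of how those permissions would be applied from a shell, assuming the XML files sit under $HOME/html and the generator's data directory is $HOME/html/generator/data (adjust the paths to your actual install):

# Make the sitemap files writable and the data directory fully writable
chmod 666 $HOME/html/sitemap.xml $HOME/html/ror.xml
chmod -R 777 $HOME/html/generator/data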

I also tried the cron job setup from XML-Sitemaps' "Crawling" tab. Same result.

Manual crawling from the browser works fine, both with and without the IP setting.

What can cause this error?

Thanks in advance.

Chris
Re: Error from cron job at GoDaddy
« Reply #1 on: July 14, 2011, 04:38:40 PM »
It looks like the generator crawler receives an "access denied" error when fetching the starting URL page. Unless the server is specifically configured to block requests from command-line scripts, you might be able to get it working by setting the server IP address in the generator configuration.
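A quick way to check this from the hosting account itself (via SSH or a one-off cron entry) is a plain command-line request. The flags below are standard curl options; 1.2.3.4 is only a placeholder for the server IP you would enter in the generator's advanced settings:

# Fetch the headers of the starting URL; if this returns the squid ERR_ACCESS_DENIED page,
# the block is on the hosting/proxy side rather than in the generator itself
curl -sI http://www.thearticletips.com/

# Same request sent directly to the server's own IP with the Host header set,
# which is roughly what the server IP address option is meant to make the crawler do
curl -sI -H "Host: www.thearticletips.com" http://1.2.3.4/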