hi - I'm getting an error each time I try to run the crawler.
The crawl stops after about 3 minutes on a website of more than 2000 products.

I don't know what to do. What is wrong?


Re: hi - I'm getting an error each time I try to run the crawler
« Reply #1 on: July 11, 2017, 10:45:34 PM »
Me, too. I'm wondering whether the xml-sitemap generator generates sitemaps for an https: address. Could that be the problem? The error message says: "Sorry, the page you are looking for is currently unavailable. Please try again later." That's followed by a link to a lengthy error log that I can't make head or tail of. But trying again later doesn't help.
Re: hi - I'm getting an error each time I try to run the crawler
« Reply #2 on: July 13, 2017, 10:29:53 AM »
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration on your host (the php.ini file), or contact your hosting support about this.
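
For example, the relevant lines in php.ini could look like this. These values are only a suggestion, not official requirements; how high you can actually set them depends on your hosting plan:

memory_limit = 256M        ; shared hosts often default to 32M, too low for a large crawl
max_execution_time = 300   ; maximum script runtime in seconds

Depending on how PHP is run on your host, you may need to restart the web server for the new values to take effect.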
 
Re: hi - I'm getting an error each time I try to run the crawler
« Reply #3 on: July 13, 2017, 08:31:08 PM »
But yesterday you ran a complete sitemap for me (I paid for you to set up the software on my server), presumably on the same server, with no problem. Anyway, here is the entire php.ini file from my cgi-bin. Please let me know what, if anything, needs to be changed:

;
;  Including file: /data/templates/web/user/domain.com/cgi-bin/php.default.ini
;

upload_tmp_dir = /data/tmp
short_open_tag = On
;register_globals = On
safe_mode = Off
upload_max_filesize = 100M
post_max_size = 100M
output_buffering = 1024
mime_magic.magicfile = /usr/share/misc/file/magic.mime
memory_limit = 32M
include_path = .:/usr/share/php:/usr/services/vux/lib/php
disable_functions = shell_exec,passthru,exec,system,pcntl_exec
allow_url_include = 0
allow_url_fopen = 0
extension_dir = /usr/services/vux/php5/lib/php/extensions

;
;  Set up realpath caching to improve performance against NFS filers
;
realpath_cache_size = 128K
realpath_cache_ttl = 600

;
;  Suhosin Configuration
;

; Disable suhosin for now
;extension = suhosin.so
;suhosin.simulation = On
;suhosin.executor.disable_eval = On
;suhosin.executor.eval.blacklist = popen

session.gc_probability = 1
session.gc_divisor = 100
session.gc_maxlifetime = 3600

date.timezone = America/New_York

error_reporting = (E_ALL & ~E_WARNING & ~E_DEPRECATED & ~E_NOTICE & ~E_STRICT)


;
;  Including file: /data/templates/web/user/domain.com/cgi-bin/php.000.ini
;

zend_extension = /usr/services/vux/php5/lib64/php/extensions/ioncube_loader_lin_5.3_real.so
zend_extension = /usr/services/vux/php5/lib64/php/extensions/ZendOptimizer_real.so

extension_dir = /usr/lib64/php5.3/lib/extensions/no-debug-non-zts-20090626/

;
;  User-based Defaults
;

session.save_path=/data/0/1/62/158/1551647/user/1670904/cgi-bin/.php/sessions



;
;  Including file: /data/0/1/62/158/1551647/user/1670904/cgi-bin/php.ini
;

Re: hi - I'm getting an error each time I try to run the crawler
« Reply #5 on: July 14, 2017, 03:10:08 PM »
So are you saying now that there's no problem with my php.ini file? Do you have any idea why the process keeps stopping? Should I just keep resuming until the sitemap is completed? And will the sitemap be accurate even though I keep ignoring the error messages?
Re: hi - I'm getting an error each time I try to run the crawler
« Reply #6 on: July 15, 2017, 08:30:28 AM »
Yes, there is no problem with your php.ini file. You need to resume the crawl, since the PHP settings do not allow it to complete in one step, as described above.
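
If you want to check which limits actually apply to the generator script (a per-directory php.ini, like the one you posted, can be overridden elsewhere), you can upload a small test script next to it. This is just a diagnostic sketch; the file name check.php is arbitrary:

<?php
// check.php - print the PHP limits that apply in this directory.
// Upload it next to the sitemap generator and open it in a browser.
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'loaded php.ini: ' . php_ini_loaded_file() . "\n";

If max_execution_time is small (30 or 60 seconds is a common default) and memory_limit is 32M, the crawler will be cut off long before it can walk 2000+ product pages, which would match the behavior you are seeing.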