BUG V1.02: Crawling >> run
« on: July 06, 2005, 11:07:32 AM »
This is a bug in version 1.02.
I submitted a new URL that is not running on the same server, and it produces the following error:

Code:
Links depth: 1
Current page: index.php
Pages scanned: 1 (16.8 Kb)
Pages left: 27
Time passed: 0:10
Time left: 4:35

Warning: set_time_limit(): Cannot set time limit in safe mode in /home/virtual/site41/fst/var/www/html/cms/google/pages/class.grab.inc.php(2) : eval()'d code(1) : eval()'d code(1) : eval()'d code on line 6

« Last Edit: July 06, 2005, 11:15:57 AM by raramuridesign »
Re: BUG V1.02: Crawling >> run
« Reply #1 on: July 06, 2005, 12:13:40 PM »
Could it be PHP?
What are the minimum required settings?
Re: BUG V1.02: Crawling >> run
« Reply #2 on: July 06, 2005, 12:36:12 PM »

Thanks for your input.
This is not a bug, only a warning message. It means that your PHP settings do not allow the script to change its maximum execution time. That is not the default PHP configuration; it is usually applied in some shared hosting environments (PHP "safe mode"). It does not mean you can't use Sitemap Generator: it will still crawl the site and create the sitemap. The only problem is that for large sites (with many pages) there may not be enough time to complete the crawl, which is why the script tries to extend the limit in the first place. If that turns out to be the case, you should contact your hosting support about it.
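For anyone who maintains their own scripts and wants to avoid the warning rather than just ignore it, a minimal sketch of the usual guard is below. It assumes a PHP version where the `safe_mode` directive still exists (it was removed in PHP 5.4); the directive names are standard, but the values are illustrative only:

```php
<?php
// Sketch: only try to raise the execution limit when PHP is not
// running in safe mode, so no warning is emitted on restricted hosts.
if (!ini_get('safe_mode')) {
    // 0 means "no time limit" for this request.
    set_time_limit(0);
}
// Under safe mode the limit cannot be changed from the script at all;
// the host-configured max_execution_time (php.ini) applies instead,
// so long crawls on large sites may still be cut off.
```

Raising the limit for real on a safe-mode host requires a server-side change (e.g. the host setting `safe_mode = Off` or increasing `max_execution_time` in php.ini), which is why contacting hosting support is the right next step.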

Re: BUG V1.02: Crawling >> run
« Reply #3 on: July 08, 2005, 08:10:56 AM »
Thanks for the information.
I will be looking into this.