Warning when Crawling
« on: August 14, 2009, 09:57:09 PM »
I've been receiving this warning message when trying to create a sitemap:
Warning: set_time_limit() [function.set-time-limit]: Cannot set time limit in safe mode in /home2/gottalot/public_html/albumfiend/generator/pages/class.grab.inc.php(2) : eval()'d code on line 8

and then the crawling will completely stop and this new error shows up below the first one:
Fatal error: Maximum execution time of 30 seconds exceeded in /home2/gottalot/public_html/albumfiend/generator/pages/class.grab.inc.php(2) : eval()'d code on line 386

and then I have to click "Crawling" and resume. Quite annoying; what's the problem?
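For context, the warning appears because the script calls set_time_limit() while PHP's safe_mode is enabled; under safe mode that call is a no-op, so the default 30-second max_execution_time still applies and the crawl dies. A minimal sketch of the guard involved (assumed, not the generator's actual code):

```php
<?php
// Sketch only (assumed, not the generator's actual code):
// set_time_limit() is disabled under safe mode, so guard the call
// to suppress the warning. The 30-second limit still applies until
// safe mode is turned off or max_execution_time is raised.
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 = no execution-time limit for this request
}
```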
Re: Warning when Crawling
« Reply #1 on: August 14, 2009, 10:32:40 PM »
Also, I just updated to the latest version (v3.0) and have been running it, but I still have to resume crawling every five minutes because it stops with that error message. Now it's REALLY annoying. It gave me the error when the stats were:
Pages added to sitemap: 6518
Pages scanned: 7360 (228,095.6 KB)
Pages left: 158 (+ 2347 queued for the next depth level)

then I clicked "Crawling" to resume it again and it says:
Continue the interrupted session (2009-08-14 17:26:25, URLs added: 0, estimated URLs left in a queue: 0)

So the half hour I spent trying to get it to crawl was completely wasted. Would somebody please reply to this, preferably the admin?
Re: Warning when Crawling
« Reply #2 on: August 15, 2009, 12:14:57 AM »
Hello,

it looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in your PHP configuration (the php.ini file at your host), or contact your hosting support about this.
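For reference, the relevant php.ini directives might look like this (example values only; adjust to your host's policy):

```ini
; php.ini - example values, not recommendations for every host
memory_limit = 128M          ; RAM available to the script
max_execution_time = 300     ; seconds before a fatal timeout
safe_mode = Off              ; safe mode blocks set_time_limit()
```

If php.ini isn't editable on shared hosting, some mod_php setups accept the same directives in a .htaccess file (e.g. `php_value max_execution_time 300`); hosting support can confirm which mechanism applies.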
Re: Warning when Crawling
« Reply #3 on: August 15, 2009, 02:20:07 AM »
Does this also explain why the script runs SO slow on my website? My website always runs very fast, but running this script it is very slow... I have added 12,000 links to sitemap so far today (After MANY MANY MANY stop/resume because the script gets stuck) and it runs so slow it's almost unbearable...
Re: Warning when Crawling
« Reply #4 on: August 15, 2009, 01:41:00 PM »
Hello,

The crawling time depends mainly on the website's page generation time, since the script crawls the site much like search engine bots do.
For instance, if it takes 1 second to retrieve each page, then 1000 pages will be crawled in about 16 minutes.
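The arithmetic behind that estimate, as a quick sketch (figures taken from the example above):

```php
<?php
// Back-of-the-envelope crawl-time estimate.
$pages = 1000;
$seconds_per_page = 1.0;   // average time to generate/retrieve one page
$minutes = $pages * $seconds_per_page / 60;
printf("~%.1f minutes\n", $minutes); // ~16.7 minutes
```

At that rate, the 14,520 pages scanned later in this thread would imply roughly four hours; the reported run time of 1:56:31 works out to about half a second per page.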
Re: Warning when Crawling
« Reply #5 on: August 15, 2009, 09:49:51 PM »
I spoke to my web host and had the memory limit increased to 128MB; I also increased the execution time and disabled safe mode, so I am no longer getting the error messages. But this still has not fixed the problem of it freezing up on me for no apparent reason. Right now it is completely stuck at:

Links depth: 18
Current page: album/43620/The-Cure-Wild-Mood-Swings.html
Pages added to sitemap: 13130
Pages scanned: 14520 (418,087.0 KB)
Pages left: 14 (+ 2843 queued for the next depth level)
Time passed: 1:56:31
Time left: 0:00:06
Memory usage: -
Resuming the last session (last updated: 2009-08-15 16:24:18)

I hit "Crawling" again and try resuming, but it just loads this screen again and sits there. I also tried running with the option "Do not interrupt the script even after closing the browser window until the crawling is complete" and it still gets stuck on this page.
Can I provide the admin with the URL to the login page for my Sitemap Generator so he can take a look at it firsthand? I've spent nearly two days now trying to get a sitemap created. :-X
Re: Warning when Crawling
« Reply #7 on: August 16, 2009, 09:22:42 PM »
It says I'm not allowed to send private messages?
The URL is [ External links are visible to forum administrators only ]
but I'm not posting the login details here; I need to be able to send PMs.