Re: Sitemap crawling never stops/ends
« Reply #15 on: February 21, 2008, 12:29:27 AM »
Hello,
We have the standalone sitemap generator, but once crawling starts there is no stop button; the crawl begins with 1 page and then never finishes, it just "hangs".
Stopping it is only possible by uploading interrupt.log.
Can you please help us?
Kind regards,
Peter
Re: Sitemap crawling never stops/ends
« Reply #16 on: February 22, 2008, 12:35:26 AM »
Hello,

you can also try closing your browser and reopening it, or opening the generator in another browser (e.g. IE or Firefox).
Re: Sitemap crawling never stops/ends
« Reply #17 on: August 11, 2008, 05:52:48 PM »
Is there any other simple way to stop crawling? Copying interrupt.log to the data folder doesn't work, and changing browsers doesn't help either.

It seems to get stuck in an endless loop after crawling only about 100 pages.

...
This is my second test of crawling a website to create a sitemap. The first test, on another site, always stopped after some time and I had to resume it manually. But now, after about 6699 pages, it is no longer possible to continue - I gave up.

 ???
Re: Sitemap crawling never stops/ends
« Reply #19 on: August 29, 2008, 08:13:42 PM »
Here is the answer from my ISP (1and1.com):

"With regard to your query, due to resource limits on our Shared Hosting machines, it is not possible to allocate more than 20M of memory to PHP, although phpinfo() may report a higher number.
You will be unable to increase the memory usage limit with a php.ini file."

In this case what would be an alternative?


All the best,

Ed Torres

Hello,

do you mean that the crawling stops at this point?
If so, your server limits the maximum script execution time and/or memory for scripts, and you need to modify your PHP configuration:
increase the memory_limit and max_execution_time settings in php.ini and restart Apache.
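For example, the relevant php.ini directives could look like this (the values below are only an illustration; choose limits that suit your site size and that your host actually allows):
Code: [Select]
; php.ini - raise the limits that can cut a long crawl short (example values only)
memory_limit = 128M          ; maximum memory a single PHP script may use
max_execution_time = 3600    ; maximum script run time in seconds (0 = no limit)
Remember that the new values only take effect after Apache is restarted.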

also discussed here: https://www.xml-sitemaps.com/forum/index.php/topic,318.html
Re: Sitemap crawling never stops/ends
« Reply #21 on: September 11, 2008, 05:07:48 PM »
This thing will not end! Now I have 'Pages added to sitemap: 68657'. That's impossible. What's wrong with this thing???

Also, when I try to run it through SSH, nothing happens:


[root@server generator]# /usr/local/bin/php /home/plays/public_html/generator/runcrawl.php
<html>
<head>
<title>XML Sitemaps - Generation</title>
<meta http-equiv="Content-type" content="text/html;charset=iso-8859-15" />
<link rel=stylesheet type="text/css" href="pages/style.css">
</head>
<body>
[root@server generator]#


Now what???
Re: Sitemap crawling never stops/ends
« Reply #22 on: September 11, 2008, 06:24:11 PM »
Quote
This thing will not end! Now I have 'Pages added to sitemap: 68657'. That's impossible.
What exactly is impossible? Do you have fewer pages than that on your site? Please PM me your generator URL/login so I can check the details.
Re: Sitemap crawling never stops/ends
« Reply #23 on: January 27, 2009, 08:28:18 AM »
Hi there

I get a similar error: having indexed 2900 pages, every time I tell the script to continue from where it left off, it doesn't crawl any further.

In terms of the solution proposed below, my site is on shared hosting so I can't change the time and memory settings - are there any other ways of doing this without being the server administrator?

Thanks

Hello Staffan,

the number of pages depends entirely on your site; do you mean that there are far fewer than 25,000 URLs on your site?
Did you increase the memory_limit and max_execution_time settings in php.ini? (The script cannot run for an unlimited time if the server limits it.)
Re: Sitemap crawling never stops/ends
« Reply #25 on: March 04, 2009, 07:48:44 PM »
I am running into the same issue. I have around 30K URLs to crawl. The script breaks after 2000. It says: "Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2953092 bytes) in /home/waytwoco/public_html/generator/pages/class.utils.inc.php(2) : eval()'d code on line 6"

I have already increased memory_limit in php.ini.
Re: Sitemap crawling never stops/ends
« Reply #26 on: March 04, 2009, 11:30:56 PM »
Please try creating a phpinfo.php file in the generator folder with:
Code: [Select]
<?php phpinfo(); ?>
and then open it in your browser and check the memory_limit value to make sure it has actually been increased.
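If reading the full phpinfo() page is inconvenient, a small script along these lines can print only the two values discussed in this thread (the file name check_limits.php and the script itself are just an illustration, not part of the generator):
Code: [Select]
<?php
// check_limits.php - minimal sketch, not part of the generator:
// print the two PHP settings the crawler depends on
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
Open it in the browser the same way as phpinfo.php and remove it when done.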
Re: Sitemap crawling never stops/ends
« Reply #27 on: March 18, 2009, 02:44:10 PM »
   
Hello,
I am encountering a problem with the crawling. I cannot change the php.ini file because my hosting is shared!
Do you have a solution?
Best regards,
py
Re: Sitemap crawling never stops/ends
« Reply #29 on: June 08, 2009, 06:27:12 AM »
Yes, I am having the same problem. I took the advice of configuring the crawler to save its state every 60 seconds / 1000 pages, and now I'm getting this error; because the script is encoded, I cannot do anything about it:

Warning: Division by zero in /home/content/my site/html/generator/pages/page-config.inc.php(2) : eval()'d code on line 74