Welcome to the Sitemap Generator Forum.
 

Script kills server (Plesk Onyx 17.0.17, CentOS Linux 7.3.1611 (Core), PHP 7.1)

Started by somebody that I used to know, January 31, 2017, 11:02:02 PM


somebody that I used to know

Hi,

We have tested it several times after moving to the new server. We have deleted everything, re-installed, and re-configured it again and again, but it does not help.

After starting the sitemap crawl, it no longer shows progress as it did before, and after about 300 to 400 pages it kills the server. All domains and websites go down; the only way to get them working again is to restart the server via Plesk. Afterwards, opening index.php?op=crawl shows that the script is running and even displays the progress, but it does not update live, so any progress update must be forced by refreshing every 30 or 60 seconds:

Already in progress. Current process state is displayed:

Links depth: 2
Current page: XXXXXXXXXXX
Pages added to sitemap: 330
Pages scanned: 412 (26,934.0 KB)
Pages left: 36 (+ 1949 queued for the next depth level)
Time passed: 0:36:11
Time left: 0:03:09
Memory usage: 6,058.0 Kb

We also changed:
memory_limit from 128M to 512M
max_execution_time from 30 to 9000
but this does not change anything.
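For reference, the changes correspond to php.ini values like the following (the ini file location depends on the PHP handler selected in Plesk, so your path may differ):

```ini
; php.ini — raised limits for the long-running crawl
memory_limit = 512M        ; was 128M
max_execution_time = 9000  ; seconds; was 30
```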

The server is a powerful cloud server and has no issues with anything else.

Do you have any idea? Does XML Sitemaps Generator need an update to work on recent servers? Any help would be appreciated.

Best,
Wolfgang

XML-Sitemaps Support

Hello,

It's possible that the server can't handle the number of requests the generator's crawler is sending. You can try the "Make delay" setting to slow down crawling and avoid overloading the server.

somebody that I used to know

Thank you. I am trying.

How can we make the current process state display again? It is not showing during the crawl.
How many requests does XML Sitemaps Generator need?

XML-Sitemaps Support

Hello,

If you are not getting progress updates, it must be related to buffering of the script's output on the server side. This can be affected by various server settings; you can find a description here:
[ External links are visible to logged in users only ]
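As a rough sketch (the exact directives depend on your web server and PHP handler), disabling the usual buffering layers typically involves settings like these:

```ini
; php.ini — disable PHP-level output buffering
output_buffering = Off
zlib.output_compression = Off
implicit_flush = On

; Note: with nginx in front of PHP-FPM, the web server may still buffer
; FastCGI responses ("fastcgi_buffering off;" in the nginx config);
; reverse proxies can add yet another buffering layer.
```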

The number of requests corresponds to the number of pages on the website.

somebody that I used to know

Thank you very much, Oleg, for your kind support. We will work on your suggestion.

somebody that I used to know

Oleg, we and the hosting company have tried everything.

phpinfo now shows (local / master values):

zlib.output_compression    Off    Off
output_buffering           0      0

The flush-test still shows:
started: 2017.03.16 04:39:18
finished: 2017.03.16 04:39:24
Note: if flush() works in PHP, then the difference between start and end dates should be approx. 0-1 second. If flush() fails (PHP output is buffered), then the difference will be >= 5 seconds.

Flush failed. :-(
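For anyone reproducing this, a flush test along those lines might look like the following. This is a sketch, not the actual test script shipped with the generator; the padding size and timings are assumptions:

```php
<?php
// Sketch of a flush test: emit one chunk per second for 5 seconds.
// If output arrives in the browser incrementally, flush() works.
// If everything arrives at once after ~5 seconds, something upstream
// (PHP, FastCGI, the web server, or a proxy) is buffering the output.
header('Content-Type: text/plain');
echo 'started: ' . date('Y.m.d H:i:s') . "\n";
for ($i = 0; $i < 5; $i++) {
    echo str_repeat(' ', 4096) . "tick\n"; // padding defeats small buffers
    if (ob_get_level() > 0) {
        ob_flush();        // flush PHP's own output buffer, if active
    }
    flush();               // push output toward the web server
    sleep(1);
}
echo 'finished: ' . date('Y.m.d H:i:s') . "\n";
```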

Do you have any idea what we can do to make the XML Sitemaps script show output again?

PS: I've just updated to the new version, v7.2 - same problem afterwards. No output.


somebody that I used to know

Our cloud server has SSH, but I prefer the browser. Thank you, and thanks for the new PHP 7 compatibility! It works great. No more server freezes, and no delay between crawls is necessary!