
Crawling for another site

Started by ngftlaudhosp, June 17, 2011, 01:46:35 PM


ngftlaudhosp

Can I install the sitemap generator on another account?

It's an addon domain, so I'm wondering if we are allowed to use the same generator program in two locations, for two sites.

XML-Sitemaps Support

Hello,

Multiple copies of the Software may be installed and used, but for internal use only, on your personal websites. It cannot be used to create sitemaps for third-party sites or to provide a service to others (paid or free).
https://www.xml-sitemaps.com/license.html

ngftlaudhosp

Thank you. I need support with something that happened today: the whole forums directory was returning a 500 Internal Server Error. My host said:

----START MESSAGE----

I do see a slight spike in memory usage between 8 AM and 11 AM today, but it wasn't very large. It did drop down at 11:20 AM, which seems to coincide with the processes being killed, but unfortunately the logs are not conclusive enough to show what was causing the problem, and without the processes running it is difficult to say what caused the issue.

From what I can see it's likely these two files:

/home/ab2168/public_html/generator/runcrawl.php
/home/ab2168/public_html/generator/index.php

----END MESSAGE----

Is there anything I can do so that it is less intensive on the server?

I've deleted all the logs it stores in generator/data. Is there anything else I can do?
Thanks.

XML-Sitemaps Support

Hello,

You can use the "Make delay for X seconds after each X request" option: it tells the generator's crawler to "sleep" for some time between requests, which slows down the crawling speed and reduces the server load as a result.
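
For illustration, here is a rough sketch of the idea in PHP. This is not the generator's actual source code, only the general "pause after every N requests" pattern; the $urls list and the $delaySeconds and $requestsPerBatch variables are made up for this example.

<?php
// Illustration only: a minimal sketch of throttling a crawler by sleeping
// after every batch of requests. Not the generator's real code; the
// variable names below are hypothetical.

$urls = [
    'https://example.com/',
    'https://example.com/page1',
    'https://example.com/page2',
];

$delaySeconds     = 2;  // how long to pause ("X seconds")
$requestsPerBatch = 5;  // pause after this many requests ("each X requests")

$count = 0;
foreach ($urls as $url) {
    // Fetch the page; a real crawler would also parse it for new links.
    // Requires allow_url_fopen; errors are suppressed for brevity.
    $html = @file_get_contents($url);

    $count++;
    // After every batch of requests, sleep so the web server gets a breather.
    if ($count % $requestsPerBatch === 0) {
        sleep($delaySeconds);
    }
}

A larger delay or a smaller batch size makes the crawl take longer, but keeps memory and CPU usage on a shared server lower.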

ngftlaudhosp

Quote from: XML-Sitemaps Support on June 23, 2011, 12:31:32 PM
Hello,

You can use the "Make delay for X seconds after each X request" option: it tells the generator's crawler to "sleep" for some time between requests, which slows down the crawling speed and reduces the server load as a result.

Thanks, you can close this thread now if you like.