Increasing memory
« on: March 28, 2008, 10:13:25 PM »
Hi

I've been trying for a while to complete a site map, but it just keeps stalling at 1758. I've pretty much changed everything I can from the control panel, but what I can't find is where I can increase the memory limit, because I think that's the reason why it's not working after about 15 tries!

thanks
Tommy
Re: Increasing memory
« Reply #1 on: March 28, 2008, 11:49:30 PM »
Hello,

The memory_limit and max_execution_time settings should be increased in the PHP configuration on your host (the php.ini file on the server).
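
For example, the relevant php.ini directives look like this; the values below are only illustrative, so pick limits that suit the size of your site:

Code:
memory_limit = 128M        ; maximum memory a single PHP script may use
max_execution_time = 600   ; maximum script run time, in seconds

If PHP runs as an Apache module, restart Apache afterwards so the new values take effect.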
Re: Increasing memory
« Reply #2 on: March 29, 2008, 12:53:45 AM »
I'm on shared hosting. Is it possible there? Or did I buy something that won't work on my website?
Re: Increasing memory
« Reply #3 on: March 29, 2008, 10:25:51 AM »
Looks like the memory limit from my host is only 16M :-[. So did I waste $19.99, even though I did read the installation requirements? Or is there any program where I can get a sitemap for my site, like the free one here?
« Last Edit: March 29, 2008, 10:35:09 AM by auctions3 »
Re: Increasing memory
« Reply #4 on: March 30, 2008, 09:20:18 PM »
Hello,

Please PM me your generator URL with the estimated total number of pages on your site so that I can check it.
Re: Increasing memory
« Reply #5 on: April 02, 2008, 01:28:21 AM »
The answer to your problem is simple: run the script on its own on a home/office server. I had been running the script for months on SiteGround shared hosting with no luck. It was only when I read a reply from the admin in this forum that I realized the script does not have to be on the same server as the website being crawled.

I have used XAMPP on a Windows machine; I had to change the php.ini file in about 5 locations to increase max_execution_time and the memory limit. I just set the file locations and names to suit my site in the configuration, then uploaded the completed sitemap to my host.
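
If you are not sure which of XAMPP's php.ini copies is actually being loaded, a one-line script will tell you; this is plain stock PHP, nothing specific to the generator:

Code:
<?php
// Browse to this file and look for the "Loaded Configuration File"
// row in the output; that is the php.ini you need to edit.
phpinfo();

Running "php --ini" from a command prompt prints the same information.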

Apart from taking up a massive amount of memory and time, the script runs sweet! I assume it's taking some time to crawl due to my shared host limiting the allowed number of HTTP requests.

I hope this has been helpful to anyone using shared hosting!

Dan
Re: Increasing memory
« Reply #6 on: April 05, 2008, 10:33:48 PM »
Hello,

Please provide specific instructions on how to edit the php.ini file on the server.

Quote:
The memory_limit and max_execution_time settings should be increased in the PHP configuration on your host (the php.ini file on the server).
Re: Increasing memory
« Reply #7 on: April 07, 2008, 08:53:56 PM »
Thanks Dan, that had crossed my mind, but I wasn't sure if it would work. Will try that now.
Re: Increasing memory
« Reply #8 on: April 09, 2008, 10:27:55 PM »
Thanks, that did the trick! ;D

I don't know why the php.ini file is stuck inside the Apache bin folder, but you're right, it does take up a massive amount of memory and time. No errors with Google though, so thumbs up.
Re: Increasing memory
« Reply #9 on: April 12, 2008, 12:38:28 PM »
Hey Lottoplus

Good to see it worked for you. I just thought I would add that I found the sitemap files written to the data folder after the crawl was complete. Dunno why, but they are there, and that's all that matters.

I would urge Oleg to warn all his customers who use shared hosting that it is pointless using this script on a shared host. I wasted two months of my time before I realized this. Shared hosts just do not let you configure the server the way the script needs.

Oh, I had also thrown a second hard disk into the XAMPP sitemap server machine I am using. This was to get the server to use the second disk for the page file to free up resources. I have 2 GB of physical memory installed on the machine.

To this day I no longer have issues with this script since running my own server. Another good piece of advice is to lower the number of URLs written to the sitemap xml.gz files, as I had found that the sitemaps I built had size errors on Google. I changed this in the config.inc.php file: look for xs_sm_size and change it to 40000. Once I had done this, the sitemap was read by Google with no issues.
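
For anyone else hunting for it, the setting in my copy of config.inc.php looks roughly like the line below; the exact default value and surrounding formatting may differ between generator versions, so treat this as a sketch, not a literal copy:

Code:
// cap each sitemap file at 40,000 URLs so the .xml.gz files
// stay within Google's per-file size limits
$xs_sm_size = 40000;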

Thanks

Dan
Re: Increasing memory
« Reply #10 on: April 16, 2008, 01:04:32 PM »
Hi guys, I'm on a service where I can set my own memory limit, but the script is still taking up too much.

I wonder if there are any ways to reduce the memory requirement of the application. I understand that reducing the depth of the scan reduces it, but are there any other ways? Like only generating one sitemap file per scan? Or increasing the time the script is allowed to run?

Thanks


Re: Increasing memory
« Reply #11 on: April 21, 2008, 12:25:48 PM »
Quote:
Hi guys, I'm on a service where I can set my own memory limit, but the script is still taking up too much.

I wonder if there are any ways to reduce the memory requirement of the application. I understand that reducing the depth of the scan reduces it, but are there any other ways? Like only generating one sitemap file per scan? Or increasing the time the script is allowed to run?

Thanks

I have tried many things to reduce memory usage on my machine, but it seems the script is just hungry for memory! I have 2 GB of memory available and the page file set to the maximum, and still the script chews about 950,936 KB on average. I just hope Oleg works to improve the script's memory use in future releases.

Re: Increasing memory
« Reply #12 on: May 20, 2008, 01:31:53 PM »
I have the same shared hosting issues.

HOW DO YOU "RUN ON A HOME/OFFICE SERVER?"
Re: Increasing memory
« Reply #13 on: May 20, 2008, 02:15:40 PM »
Buy a PC, load Linux, install the sitemap generator, plug it into the Internet, point the sitemap generator at your URL, then upload the resulting XML to your root domain folder.
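
Roughly, on a Debian/Ubuntu box that comes down to something like the commands below; the package names, paths, and generator entry page are assumptions about a typical install, so adjust them to your setup:

Code:
# install Apache with PHP (package names vary by release)
sudo apt-get install apache2 php libapache2-mod-php

# unpack the generator somewhere under the web root
sudo unzip generator.zip -d /var/www/html/generator

Then open http://localhost/generator/index.php in a browser, set your site's URL in the configuration, run the crawl, and upload the finished sitemap.xml to your real host's root folder.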


Re: Increasing memory
« Reply #14 on: May 21, 2008, 01:59:00 PM »
If we were to change the permissions and allow the home server to write to the XML files, would that work? I did try this last week before reading this thread. I have a simple box with CentOS 5.1 installed, and I changed all the values to the ones on the "real" server. LOL... after 200 hours, I had to stop it.

The problem I have is that there is a video script in a subfolder that pulls the videos from YouTube. This seems to be causing the lengthy process. Also, one difference I saw: as the home server crawled the main site, I saw the progress in the window, but no progress is shown if I run the script on the main site.
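
Changing the permissions as described usually means something along these lines; the paths here are examples, assuming the generator lives under the web root, so adjust them to your own layout:

Code:
# let the web server overwrite the sitemap file
chmod 0666 /var/www/html/sitemap.xml

# let the generator write its crawl state and output
chmod -R 0777 /var/www/html/generator/data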