XML Sitemaps Generator

Author Topic: Generator stops every time  (Read 13570 times)

hofmann.schneeberg

  • Registered Customer
  • Approved member
  • *
  • Posts: 2
Generator stops every time
« on: October 30, 2008, 11:16:24 PM »
Hello !

First of all: my English is not very good, I hope you understand me...

I don't know what to do: every time, the generator stops with this message:

Already in progress. Current process state is displayed:
Links depth: 2
Current page: suche/Biermaxx.html
Pages added to sitemap: 259
Pages scanned: 260 (8,976.2 KB)
Pages left: 4328 (+ 5294 queued for the next depth level)
Time passed: 4:26
Time left: 73:58
Memory usage: 3,641.6 Kb


I have the paid generator.
What can I do?

THX,
Frank

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Generator stops every time
« Reply #1 on: October 30, 2008, 11:59:59 PM »
Hello,

It looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact your hosting support about this.
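For reference, a sketch of what such php.ini changes might look like. The values below are illustrative only, not official recommendations from the generator; suitable limits depend on your host and site size, and some shared hosts ignore or cap these settings:

```ini
; php.ini - illustrative values, adjust for your host
memory_limit = 512M        ; give the crawler room for a large URL queue
max_execution_time = 3600  ; seconds; 0 would disable the limit entirely (CLI only)
```

After editing php.ini you usually need to restart the web server (or PHP-FPM) for the new limits to take effect; on shared hosting, a per-directory `.htaccess` or a hosting control panel may be the only way to change them.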
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.

hofmann.schneeberg

  • Registered Customer
  • Approved member
  • *
  • Posts: 2
Re: Generator stops every time
« Reply #2 on: October 31, 2008, 01:56:51 AM »
Hello !

Thank you for the answer!
What are the best settings? (For a very big site, with millions of URLs...)

THX,
Frank

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Generator stops every time
« Reply #3 on: November 02, 2008, 08:45:23 AM »
Hello,

With a website of this size, the best option is to create a limited sitemap: set the "Maximum depth" or "Maximum URLs" option so that it gathers about 200,000-300,000 URLs. Those would be the main pages, giving search engines a "roadmap" sitemap.

The crawling time itself mainly depends on the website's page generation time, since the generator crawls the site similarly to search engine bots.
For instance, if it takes 1 second to retrieve each page, then 1000 pages will be crawled in about 16 minutes.
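The estimate above is just pages multiplied by average fetch time. A minimal sketch of that arithmetic (the function name and defaults are my own, not part of the generator):

```python
def estimated_crawl_time_minutes(pages, seconds_per_page=1.0):
    """Rough crawl-time estimate: total pages times average fetch time,
    converted from seconds to minutes."""
    return pages * seconds_per_page / 60.0

# 1000 pages at 1 second each: roughly 16-17 minutes, matching the example above.
print(estimated_crawl_time_minutes(1000))
```

Note this ignores overhead such as parsing and queue management, so real runs (like the 38-hour example below) can deviate from the raw per-page figure.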

Some of the real-world examples of big db-driven websites:
about 35,000 URLs indexed - 1h 40min total generation time
about 200,000 URLs indexed - 38hours total generation time

With the "Maximum URLs" option defined, generation would be much faster than that.

 

SMF 2.0.12 | SMF © 2014, Simple Machines