XML Sitemaps Generator

Author Topic: The crawler page will not load completely?  (Read 18917 times)


  • Registered Customer
  • Approved member
  • *
  • Posts: 6
The crawler page will not load completely?
« on: March 07, 2008, 11:47:52 PM »
Hello Everyone

I have been using XML-Sitemaps with no issues until one day it stopped showing the complete crawl page. The only text I get on the page is:

"Run in background
Do not interrupt the script even after closing the browser window until the crawling is complete"

I cannot crawl my website through the web interface.

I have tried to work around this by starting a cron job with the following command:
/usr/bin/lynx --dump [external links are visible to admins only]
but I am still unable to crawl my website. My max_execution_time is 600 seconds, if that helps. Could someone please help me with this issue?
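
For reference, here is roughly what the crontab entry looks like. The URL is only a placeholder (the real link is hidden above) and the 02:00 schedule is arbitrary:

# Hypothetical crontab line -- example.com stands in for the real crawl page URL.
# Runs lynx in dump mode nightly at 02:00 and discards the output.
0 2 * * * /usr/bin/lynx --dump "http://www.example.com/generator/index.php" > /dev/null 2>&1

(I assume max_execution_time still applies to the PHP script that this request triggers on the server, so a long crawl might hit that 600-second limit anyway.)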



XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10599
Re: The crawler page will not load completely?
« Reply #1 on: March 09, 2008, 08:54:20 PM »

It looks like your memory limit is exceeded when the sitemap generator tries to read the progress log file. Please try increasing the memory_limit setting in your PHP configuration.
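
For example (the values and file locations below are only illustrative; they depend on your hosting setup):

# php.ini -- raise the PHP memory limit, e.g.:
memory_limit = 256M

# Or, where per-directory overrides are allowed (mod_php), in .htaccess:
php_value memory_limit 256M

# To check what limit is currently in effect, a phpinfo() page will show it,
# or from the command line (the CLI value can differ from the web server value):
php -i | grep memory_limit

After changing the value, reload the web server if needed so the new setting takes effect.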
Oleg Ignatiuk
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.

