XML Sitemaps Generator

Author Topic: Crawling Page broken  (Read 10044 times)

support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Crawling Page broken
« on: February 19, 2010, 06:04:14 PM »
After a week of the sitemap generator crawling the site, I went to check on the progress today and the crawling page appears to be missing some parts. I've attached a screenshot.

Also, no sitemap file was created.
« Last Edit: February 19, 2010, 06:07:01 PM by support49 »

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Crawling Page broken
« Reply #1 on: February 19, 2010, 11:48:00 PM »
Hello,

Please try to increase the memory_limit and max_execution_time settings in the PHP configuration at your host (php.ini file), or contact your hosting support about this.
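For example, something like the following in php.ini (the values here are only illustrative; the right numbers depend on your site size and on what your host allows):

```ini
; php.ini - illustrative values, adjust to your host's limits
memory_limit = 1024M        ; maximum memory a PHP script may allocate
max_execution_time = 600    ; seconds a script may run before being stopped
```

If you cannot edit php.ini directly, hosts running PHP as an Apache module usually also honor `php_value memory_limit 1024M` in an .htaccess file.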
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.

support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Re: Crawling Page broken
« Reply #2 on: February 20, 2010, 12:52:21 PM »
PHP Fatal error:  Allowed memory size of 536870912 bytes exhausted (tried to allocate 116906983 bytes)
What do you suggest?


How do I get the start button back?

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Crawling Page broken
« Reply #3 on: February 20, 2010, 06:56:12 PM »
The "Run" button is not displayed because the memory limit is exceeded when the generator tries to read the log dump to show current progress. Try increasing memory_limit even further.

support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Re: Crawling Page broken
« Reply #4 on: February 22, 2010, 05:21:52 AM »
OK, but is there a way to force the creation of the sitemap files based on what's currently in the crawl_log rather than waiting for it to finish?

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Crawling Page broken
« Reply #5 on: February 22, 2010, 12:39:13 PM »
Hello,

Yes, you should lower the "Maximum URLs" setting to tell the generator not to crawl further pages, but you still need to increase memory_limit so that the generator can read the large dump that was already created.

support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Re: Crawling Page broken
« Reply #6 on: February 22, 2010, 01:58:27 PM »
You can change the config while it's crawling?

support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Re: Crawling Page broken
« Reply #7 on: February 23, 2010, 06:37:45 PM »
It finished crawling, but I got errors about sitemap2.xml, sitemap3.xml, etc. not being writable. Is it possible to recreate the sitemap files from the log file?


support49

  • Registered Customer
  • Jr. Member
  • *
  • Posts: 43
Re: Crawling Page broken
« Reply #8 on: February 23, 2010, 08:52:12 PM »
sitemap1.xml works, but I get this message in Mozilla:

Parse error: syntax error, unexpected T_STRING in /home/xxxxxx/public_html/sitemap2.xml  on line 1

But the first line is exactly the same in both files. Any idea what could cause this?



This seemed to fix the problem:

<Files ~ "sitemap2.*\.xml">
# Serve matching files as XML instead of letting PHP try to parse them
AddType application/xml xml
</Files>
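As a side note, if sitemap3.xml and later files throw the same parse error, a broader filename pattern should cover them all. This is an untested generalization of the fix above, not something from the generator's documentation:

```apache
<Files ~ "sitemap[0-9]*\.xml">
    # Serve all numbered sitemap files as XML, not as PHP
    AddType application/xml xml
</Files>
```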
« Last Edit: February 23, 2010, 09:03:00 PM by support49 »

XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: Crawling Page broken
« Reply #9 on: February 23, 2010, 11:13:42 PM »
Great, I'm glad it's working now.

 
