XML Sitemaps Generator

Author Topic: the crawling seem to be stuck in high depth  (Read 14536 times)

rutigolds

  • Registered Customer
  • Approved member
  • *
  • Posts: 4
the crawling seem to be stuck in high depth
« on: June 16, 2009, 11:46:48 AM »
Hi,

I have several pages on my site that display rows from large tables using a pager; pressing the Next button advances to the next 10 pages.

I tried running the crawl at least five times, and every time it reaches depth 4/5/6 it seems to get stuck.

So I added those URLs to the excluded URLs, and that really helped: for the first time, the crawl finished.

But I would really like the crawler to be able to handle those pages.

Is there a way to keep it from getting stuck?

(Or maybe to crawl only the first and second depth of those URLs?)

Thanks!




XML-Sitemaps Support

  • Administrator
  • Hero Member
  • *****
  • Posts: 10624
Re: the crawling seem to be stuck in high depth
« Reply #1 on: June 16, 2009, 05:34:00 PM »
Hello,

It looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (php.ini file), or contact your hosting support about this.
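For reference, the relevant php.ini entries might look like the sketch below. The specific values are only illustrative assumptions, not recommendations from the script's documentation; suitable limits depend on the size of your site and your host's policies:

```ini
; php.ini — raise PHP resource limits so a long-running crawl can finish
; (example values only; adjust for your site and hosting plan)
memory_limit = 256M        ; maximum memory a single PHP script may allocate
max_execution_time = 3600  ; maximum script run time in seconds (default is often 30)
```

On shared hosting these settings may be capped or php.ini may not be editable, in which case hosting support has to change them for you.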
Oleg Ignatiuk
www.xml-sitemaps.com
Send me a Private Message

For maximum exposure and traffic for your web site check out our additional SEO Services.

rutigolds

  • Registered Customer
  • Approved member
  • *
  • Posts: 4
Re: the crawling seem to be stuck in high depth
« Reply #2 on: June 17, 2009, 06:03:50 AM »
Thanks, I'll try it.


 

SMF 2.0.12 | SMF © 2014, Simple Machines