The crawling seems to get stuck at high depth
« on: June 16, 2009, 12:46:48 PM »
Hi,

I have several pages on my site that display rows from big tables through a pager;
pressing the Next button moves to the next 10 pages in the pager.

I tried running the crawl at least five times, and every time it reaches level 4/5/6 it seems to get stuck.

So I added those URLs to the exclude list, and that really helped.

Now, for the first time, the crawl finished.

But I would really like the crawler to be able to handle those pages.

Is there a way to keep it from getting stuck?

(Or maybe to crawl only the first and second depth levels of those URLs?)

thanks!



Re: The crawling seems to get stuck at high depth
« Reply #1 on: June 16, 2009, 06:34:00 PM »
Hello,

It looks like your server configuration doesn't allow the script to run long enough to create a full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (the php.ini file), or contact your hosting support about this.
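For reference, the change in php.ini might look like this; the exact values below are only illustrative, and suitable limits depend on your site size and what your host allows:

```ini
; Illustrative values only - adjust to your site size and hosting limits
memory_limit = 256M        ; maximum memory a PHP script may allocate
max_execution_time = 3600  ; maximum script run time in seconds
```

If you cannot edit php.ini directly, many hosts accept the same settings through a local .user.ini file, or via php_value lines in .htaccess when PHP runs as an Apache module.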
Oleg Ignatiuk
www.xml-sitemaps.com

For maximum exposure and traffic for your web site check out our additional SEO Services.
Re: The crawling seems to get stuck at high depth
« Reply #2 on: June 17, 2009, 07:03:50 AM »
Thanks, I'll try it.