Thanks for the assistance. I ran it... but I got this message after 35 minutes or so:
Links depth: 4
Current page: idx/mls-a3940673-7136_presidio_gln_lakewood_ranch_fl_34202
Pages added to sitemap: 5081
Pages scanned: 5860 (137,046.1 KB)
Pages left: 533 (+ 705 queued for the next depth level)
Time passed: 0:35:33
Time left: 0:03:14
Memory usage: 7,794.7 Kb
I am not very knowledgeable with this stuff, but I think part of the issue is that I need to find ways to cut down the redundant listings. I looked at the dump log, and here is a small piece of it. It looks like it is scanning all of my idx links rather than just the zip code links:
idx/19644-downtown-condos-sarasota-fl-/page-5?idx-d-SortOrders%3C0%3E-Column=Price&idx-d-SortOrders%3C0%3E-Direction=DESC',
145 => 'idx/page-3?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=200000&idx-q-PriceMin=100000&idx-q-PropertyTypes=180',
146 => 'idx/page-8?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=200000&idx-q-PriceMin=100000&idx-q-PropertyTypes=180',
147 => 'idx/page-3?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=300000&idx-q-PriceMin=200000&idx-q-PropertyTypes=180',
148 => 'idx/64135--short-sale-homes-in-sarasota/page-4?idx-d-SortOrders%3C0%3E-Column=Price&idx-d-SortOrders%3C0%3E-Direction=DESC',
149 => 'idx/64135--short-sale-homes-in-sarasota/page-11?idx-d-SortOrders%3C0%3E-Column=Price&idx-d-SortOrders%3C0%3E-Direction=DESC',
150 => 'idx/mls-m5798223-5002_e_18th_st_bradenton_fl_34203',
151 => 'idx/mls-m5802622-4420_sanibel_way_bradenton_fl_34203',
One problem with that (I am thinking) is that I can end up with duplicate listings in the sitemap, many times over in some cases, if I don't stick to one method of pulling listing results.
For instance, idx/19644-downtown-condos-sarasota-fl-/page-5?idx-d-SortOrders%3C0%3E-Column=Price&idx-d-SortOrders%3C0%3E-Direction=DESC is one of the examples; it is pulling properties for downtown condos. But then idx/page-3?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=200000&idx-q-PriceMin=100000&idx-q-PropertyTypes=180 is pulling distressed condo sales in the 100-200k price range, and another link could be pulling completely different condo criteria. The same listing could meet a ton of the different search link criteria I have on my site. I have 400 pages of real estate listings sorted by all different criteria, so I am sure there are lots of duplicates from listings matching different criteria.
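Just to show what I mean, here is a rough sketch I put together (my own illustration, not anything from the sitemap tool; the sample search URLs are shortened versions of the ones in my dump log): the same mls- detail page can be reached from several different search pages, but if you keep only the detail pages, the duplicates collapse.

```python
# My own rough illustration (not from the sitemap tool) of the duplicate problem:
# the same listing detail page shows up under several different search pages.

# Shortened sample of URLs like the ones in my dump log
crawled = [
    "idx/19644-downtown-condos-sarasota-fl-/page-5",
    "idx/page-3?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=200000",
    "idx/mls-m5798223-5002_e_18th_st_bradenton_fl_34203",
    "idx/mls-m5802622-4420_sanibel_way_bradenton_fl_34203",
    "idx/mls-m5798223-5002_e_18th_st_bradenton_fl_34203",  # same listing reached from another search page
]

# Keep only the listing detail pages (idx/mls-...) and collapse the duplicates
unique_listings = {u for u in crawled if u.startswith("idx/mls-")}
print(len(unique_listings), "unique listings out of", len(crawled), "crawled URLs")  # 2 out of 5
```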
Considering I am having some sort of timeout or memory issue (or maybe a different issue causing it to cease after 30 minutes or so), I think part of the resolution is to aim for efficiency in the scanning. The most efficient way to go is to just scan zip codes, because then every property will only be counted once. Is there a way to do that?
I have about 30 zip codes, and as an example, here is a link that would pull all properties in zip code 34201:
[ External links are visible to forum administrators only ]
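The actual link won't show here, but the rule I am picturing is roughly like this (again just my own sketch, not real settings from the tool, and idx-q-ZipCodes is only my guess at what the zip code parameter is called):

```python
# Rough sketch of the filtering I am hoping for: follow only the zip code
# searches and the individual listing pages, and skip every other filtered search.
# NOTE: "idx-q-ZipCodes" is a guess at the parameter name; I can't post the real link here.

ZIP_CODES = ["34201", "34202"]  # I have about 30 of these in total

def should_crawl(url: str) -> bool:
    if url.startswith("idx/mls-"):  # individual listing detail page
        return True
    if any(f"idx-q-ZipCodes={z}" in url for z in ZIP_CODES):  # zip code search results
        return True
    return False  # skip the price/distress/county searches that re-list the same homes

print(should_crawl("idx/mls-m5798223-5002_e_18th_st_bradenton_fl_34203"))                              # True
print(should_crawl("idx/page-3?idx-q-Counties=Sarasota&idx-q-DistressTypes=1&idx-q-PriceMax=200000"))  # False
```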
As far as that message goes, I have a 64 MB memory limit on my HostGator server. Is that the issue with it stopping after 30 minutes or so? They say I get a 64 MB memory limit, and from the message it doesn't appear I have used that much. They also say I cannot adjust the timeout, and that it is set to 30 seconds! It obviously appears I am running a lot longer than 30 seconds, more like 30 minutes, before it stops, so I am unsure what is causing it.
Also, another person I know uses this with property listings, and he said to make sure it indexes the links but doesn't crawl them, because crawling takes too long. Does that make sense, and if so, how do I do that?
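I am probably butchering his explanation, but my mental picture of "index but don't crawl" is something like this (just a sketch of the idea, not how the generator actually works):

```python
# "Index but don't crawl" as I understand it: the listing URLs still get written
# into the sitemap, but the crawler never downloads those pages to look for more
# links, which is where most of the 35 minutes seems to go.
def build_sitemap(search_page_urls, get_listing_urls):
    """get_listing_urls is a hypothetical helper that fetches one search page
    and returns the idx/mls-... links found on it."""
    sitemap = []
    for page in search_page_urls:
        sitemap.append(page)                    # the search page itself is indexed
        for listing in get_listing_urls(page):  # only search pages get fetched
            sitemap.append(listing)             # listings are indexed...
            # ...but never fetched themselves ("don't crawl")
    return sitemap
```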
My goal is to get all 10,000-15,000 or so of the dynamic listings into the sitemap.
Thanks so much for the assistance!!! I really appreciate it.