I'm trying to figure out the best way for us to build our sitemap... Should I buy this script, or would running Google's Python script on our server make more sense?
We have a PHP site that builds maybe 500,000 pages dynamically. Many of these pages haven't been hit by users yet, so they might not be in our log files... However, almost every page has probably had a link to it generated by a page that *has* been visited... so maybe there is a link in a log file somewhere??? Although most of these pages get updated very frequently, once indexed they only need to be re-spidered every once in a while... every month or couple of months.
We probably have a million pages or more on the site in total... I'm really only concerned about half of them, as I don't think Google would ever serve up the other half from random searches.
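One thing worth knowing at this scale: the sitemap protocol caps each sitemap file at 50,000 URLs, so ~500,000 pages means at least ten sitemap files tied together by one sitemap index file. Whatever tool you end up with has to do roughly the following (a minimal Python sketch; `get_urls()` and the `example.com` domain are placeholders for however you'd enumerate pages from your database):

```python
# Sketch: split a large URL list into gzipped sitemap files (max 50,000
# URLs each, per the sitemap protocol) plus one sitemap index file.
import gzip
from datetime import date

BASE = "https://www.example.com"  # placeholder domain
CHUNK = 50_000                    # protocol limit per sitemap file

def get_urls():
    # Placeholder: in practice, enumerate page URLs from your database.
    return (f"{BASE}/page/{i}" for i in range(120_000))

def write_sitemaps(urls):
    sitemap_names = []
    batch = []

    def flush():
        # Write the current batch as one gzipped sitemap file.
        name = f"sitemap{len(sitemap_names) + 1}.xml.gz"
        with gzip.open(name, "wt", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for u in batch:
                # Pages only need occasional re-spidering once indexed.
                f.write(f"  <url><loc>{u}</loc><changefreq>monthly</changefreq></url>\n")
            f.write("</urlset>\n")
        sitemap_names.append(name)
        batch.clear()

    for u in urls:
        batch.append(u)
        if len(batch) == CHUNK:
            flush()
    if batch:
        flush()

    # Index file that points the crawler at every chunk.
    today = date.today().isoformat()
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_names:
            f.write(f"  <sitemap><loc>{BASE}/{name}</loc><lastmod>{today}</lastmod></sitemap>\n")
        f.write("</sitemapindex>\n")
    return sitemap_names
```

You'd then submit only `sitemap_index.xml` to Google. Since the pages that matter already carry `changefreq` hints, a monthly value matches the "re-spider every month or two" pattern described above.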
I don't mind paying a server admin to install the Python script... and I wouldn't mind buying this product either...
I just want to do what's best for our site. I'm also not worried about a sitemap for users... only about making sure Google finds all of the pages we want it to.
And how long would it take to generate a sitemap for a site this size?