Hello
I bought the Stand Alone Sitemap Generator a few weeks back. Since then I have been trying to generate a sitemap for my website, which has a huge number of pages. The generation stops after a few hours (I have no idea exactly how many, as I always let the generator run in the background and close the window). When I check it after some time, I see it has stopped. Why does it stop?
I have set the following:
Maximum pages: 0
Maximum execution time, seconds: 9000 (which I understand is two and a half hours)
Save the script state, every X seconds: 600
Make a delay between requests, X seconds after each N requests: (left this blank as I don't know what to put here)
When I log in and see that the generation has stopped, I tick the checkboxes for "Do not interrupt the script even after closing the browser window until the crawling is complete", "Resume last session" and "Continue the interrupted session (yyyy-mm-dd 16:17:48)", and the generation starts again.
I even tried to set up a cron job with the command line provided by the software. The first time the cron job ran, I got an email saying "/bin/sh: /usr/bin/php: No such file or directory". So I removed "/usr/bin/php" from the command line, and then I got the error "/bin/sh: /hsphere/local/home/abcde/mysite.com/generator/runcrawl.php: Permission denied". I thought this might be because I had set a username and password on the configuration screen, so I removed them to allow open access to the script, but I still got the same Permission Denied error.
What should I do about the above problems?
Had the script run continuously, I guess my sitemap would have been ready by now.
By the way, where is the generated sitemap stored? I don't see any XML file in the specified path. Why? Since there is a facility to resume an interrupted session, there must be a file somewhere that holds the partially generated sitemap. Please explain.