Force new crawl rather than resume last crawl
« on: December 03, 2013, 09:48:13 PM »
I am running the site crawler from the command line and I want to force it to start a fresh crawl rather than resume a previous one. How do I do this?
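In case it matters, the crawl is launched roughly like this from a scheduled task (the PHP path and script path below are only placeholders for my actual install, not the exact command):

    REM scheduled task command line - paths and script name are placeholders
    "C:\PHP\php.exe" "C:\path\to\generator\runcrawl.php"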

The last time a crawl actually ran was two days ago, and each time I try to re-crawl the site I get a message saying it is resuming the previous crawl. I want it to crawl the site from scratch every time it is run. So far none of the following has made any difference:

- setting the "Save script state" value to 0;
- setting xs_autoresume in generator.conf to 0 (it was blank by default, so I tried both the blank value and 0);
- recycling the app pool (IIS 7) to get a new process and fresh memory;
- setting debug to true, which showed no difference in the output.
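For completeness, the relevant line in generator.conf currently reads as below (shown with the 0 I tried last; it was originally blank):

    xs_autoresume=0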

See the attached image of the script's start-up message.