I noticed that SG slows down as it indexes more pages. Over the first 4 days it scanned about 40K pages, but in the last day only about 500. I would like to index over 200K pages, but at this rate I don't know when it will finish indexing.
Pages added to sitemap: 40375
Pages scanned: 40540 (847,164.3 Kb)
Pages left: 103579 (+ 44164 queued for the next depth level)
Time passed: 4288:38
Time left: 10957:25
Memory usage: 64,821.3 Kb
Also, I very often have trouble accessing the Crawling page.
In addition, I've noticed that the script slows down the whole server.
BTW, have you thought about a database-backed version, where the links could be stored?
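To illustrate what I mean, here is a minimal sketch of a crawl queue kept in SQLite instead of an in-memory or flat-file list. All table, column, and function names here are hypothetical, just to show the idea; SQLite deduplicates links via the primary key and lets the queue grow past 200K URLs without holding everything in memory:

```python
import sqlite3

def open_queue(path=":memory:"):
    # Hypothetical schema: one row per discovered link.
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS queue (
        url   TEXT PRIMARY KEY,            -- each link stored only once
        depth INTEGER NOT NULL,            -- crawl depth level
        done  INTEGER NOT NULL DEFAULT 0   -- 0 = queued, 1 = scanned
    )""")
    return con

def enqueue(con, url, depth):
    # INSERT OR IGNORE skips duplicates without scanning the whole queue.
    con.execute("INSERT OR IGNORE INTO queue (url, depth) VALUES (?, ?)",
                (url, depth))

def next_url(con):
    # Fetch one pending URL, shallowest depth first.
    return con.execute("SELECT url, depth FROM queue WHERE done = 0 "
                       "ORDER BY depth LIMIT 1").fetchone()

def mark_done(con, url):
    con.execute("UPDATE queue SET done = 1 WHERE url = ?", (url,))
```

For example, enqueueing the same URL twice leaves only one row, and `next_url` keeps returning work until everything is marked done:

```python
con = open_queue()
enqueue(con, "http://example.com/", 0)
enqueue(con, "http://example.com/a", 1)
enqueue(con, "http://example.com/", 0)   # duplicate, ignored
mark_done(con, "http://example.com/")
print(next_url(con))                     # -> ('http://example.com/a', 1)
```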