The crawler captured 4,600+ pages from my site, but I only have around 1,300 content pages. The crawler must be looping through interface links (purchase, cart, etc.). I want a deeply cleaned sitemap of only my content pages (~1,300 HTML files). My home page is PHP and the rest of the site is dynamic, built with PHP, MySQL, Ajax, and two iframes. The static HTML pages were created for search-engine purposes only; site users access the content through the dynamic pages.
How do I configure the crawler to capture all of my content with its meta information? The result should be around 1,300 HTML pages, including the inner links to the page content.
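Most sitemap crawlers support URL exclusion filters for exactly this situation. The idea can be sketched as a simple filter over the crawled URL list; note that the patterns below (cart, purchase, checkout, login, and `action`/`add` query parameters) are assumptions for illustration, so adjust them to whatever interface URLs your crawler actually reports:

```python
# Hypothetical sketch: reduce a crawled URL list to static content pages.
# Assumes content pages are the static .html files and that interface
# links are identifiable by path segments or query parameters.
import re

# Exclusion patterns are examples only; tune them to your site's URLs.
EXCLUDE = re.compile(r"/(cart|purchase|checkout|login)\b|[?&](add|action)=", re.I)

def content_urls(urls):
    """Keep only static .html content pages; drop interface/dynamic links."""
    return [u for u in urls if u.endswith(".html") and not EXCLUDE.search(u)]

crawled = [
    "https://example.com/articles/widgets.html",   # content page: kept
    "https://example.com/cart/add.html",           # interface link: dropped
    "https://example.com/index.php?action=buy",    # dynamic page: dropped
]
print(content_urls(crawled))  # only the widgets.html content page remains
```

Applying the same kind of exclusion rules in the crawler's configuration (or in robots.txt) should bring the capture down from 4,600+ pages to roughly the 1,300 static content pages.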