trying to block certain pages from being indexed
« on: March 09, 2010, 11:27:41 PM »
I have a site where I want to exclude certain URLs from the XML sitemap.  I have tried adding the part of the URL string that I do not want indexed to the exclude URL list, but those pages are still included.  I'm not sure if I am doing something wrong or if this is outside the possible parameters.

For example, I have domainname.com/folder1/results.html, which I do want included, but there are multiple paginated versions of that results page that I do not want added for now, as the site is just far too large.

So I do not want to include: domainname.com/folder1/results_page1.html, domainname.com/folder1/results_page2.html, domainname.com/folder1/results_page3.html, etc.

I thought that if I added "_page" (without the quotes, of course) to the exclude URL list, it would not spider those pages, but this doesn't seem to be the case.  Does anyone have any suggestions on what I can do?  Do I need to add a wildcard to the end of _page, like _page*?  Is that even possible?

Thanks in advance for any assistance.
Re: trying to block certain pages from being indexed
« Reply #1 on: March 10, 2010, 03:04:09 PM »
Hello,

It should work as you described with "_page" in the "Exclude URLs" setting. Did you regenerate the sitemap after that?
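
For reference, a typical exclusion filter of this kind treats each entry as a plain substring, so no wildcard is needed. Here is a minimal sketch in Python of that assumed behavior; this is not the generator's actual code, just an illustration:

    # Minimal sketch of substring-based URL exclusion (assumed behavior,
    # not the generator's actual matching code).
    exclude_patterns = ["_page"]

    def is_excluded(url):
        # A URL is skipped if any exclusion entry appears anywhere in it.
        return any(pattern in url for pattern in exclude_patterns)

    print(is_excluded("domainname.com/folder1/results.html"))        # False - kept
    print(is_excluded("domainname.com/folder1/results_page1.html"))  # True  - excluded

Under that assumption, "_page" alone should already catch _page1, _page2, and so on, with no trailing * required.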
Re: trying to block certain pages from being indexed
« Reply #2 on: March 10, 2010, 05:13:12 PM »
Yes, I regenerated, but it still kept spidering the additional pages, such as _page1.html, _page2.html, _page3.html, etc.
Re: trying to block certain pages from being indexed
« Reply #4 on: March 17, 2010, 08:15:13 PM »
Actually, I figured out my problem: I was asking it to exclude '_page' when the URLs actually use an uppercase P, like this: '_Page'.
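
In other words, the substring check is apparently case-sensitive, so '_page' never matches '_Page'. A quick illustration, using the same assumed matching logic as the sketch above:

    # Case-sensitive substring match (assumed): '_page' is not '_Page'.
    url = "domainname.com/folder1/results_Page1.html"
    print("_page" in url)          # False - pattern never matches, page stays in sitemap
    print("_Page" in url)          # True  - matching the actual casing excludes it
    print("_page" in url.lower())  # True  - lowercasing both sides would also match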

Thanks for the time you spent anyway.