Please help! The bots are not seeing my site!
« on: July 16, 2012, 08:03:35 AM »
When I try to make an xml site map like at xml-sitemaps.com, it only sees 2 pages, which are .pdf's, not my other 20 main pages. Now the weird thing is that I have always used this site (for a couple years) and it has always seen all my pages.

So the question, obviously, is what happened to cause it not to be able to see my site any more? To my knowledge I have not changed anything but I have to suspect that I must have changed something by mistake, unknowingly.

I do have a sitemap.html on my site, linked from my MENU, with all my links, and since this problem arose I have created my own .xml sitemap (though I'm not sure it's made correctly).

However, someone at xml-sitemaps sent me this:

"You can see how search engine robots view your site at https://www.xml-sitemaps.com/se-bot-simulator.html?op=se-bot-simulator&go=1&pageurl=http%3A%2F%2Fwhatsthebestwaterfilter.com%2F&se=googlebot&submit=Start"

If anyone can help me figure out what is wrong with my site that is causing the search engines not to see it, I'd really appreciate it.

Update:
My robots.txt file is not the problem.
This is my robots.txt file:

"
# Allows all robots

User-agent: *
Sitemap: [ External links are visible to forum administrators only ]
Disallow:
"
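
For reference, a permissive robots.txt that also advertises a sitemap would look roughly like this. Note that the Sitemap directive takes a full, absolute URL (the domain below is a placeholder, not the real one):

```
# Allow all robots to crawl everything
User-agent: *
Disallow:

# The Sitemap directive must be an absolute URL (placeholder domain)
Sitemap: http://example.com/sitemap.xml
```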

I just added the Sitemap line after discovering this problem, but I still did not get a proper sitemap when trying to generate one at xml-sitemaps.com and other generators, which USED TO WORK JUST FINE for this exact site.

(I've tried a few different sitemap generators and all of them now show that I have just 2 pages, and both are PDFs! (?!)):

"
<?xml version="1.0" encoding="UTF-8"?>
<urlset
xmlns="[ External links are visible to forum administrators only ]"
xmlns:xsi="[ External links are visible to forum administrators only ]"
xsi:schemaLocation="[ External links are visible to forum administrators only ]
[ External links are visible to forum administrators only ]">
<!-- created with the Free Online Sitemap Generator at xml-sitemaps.com -->

<url>
<loc>[ External links are visible to forum administrators only ]</loc>
</url>
<url>
<loc>[ External links are visible to forum administrators only ]</loc>
<lastmod>2011-04-11T07:34:29+00:00</lastmod>
</url>
<url>
<loc>[ External links are visible to forum administrators only ]</loc>
<lastmod>2011-04-03T06:35:50+00:00</lastmod>
</url>
</urlset>
"
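
For comparison, a minimal sitemap in that same format can be generated with a short Python script. This is only a sketch; the URLs below are placeholders, not the site's real pages:

```python
# Sketch: build a minimal sitemap.xml with the standard library.
# The URLs here are placeholders, not the actual pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, lastmod-or-None) tuples; returns the XML as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        if lastmod:  # <lastmod> is optional in the protocol
            ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    ("http://example.com/", None),
    ("http://example.com/page1.htm", "2011-04-11T07:34:29+00:00"),
]))
```

Submitting a hand-built file like this in Google Webmaster Tools is a reasonable stopgap while the crawler issue is sorted out.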

So for SOME unknown reason the bots are only seeing these 2 pages, NOT the 19 .htm and .html pages that are listed in my /site_map.htm.

Re: Please help! The bots are not seeing my site!
« Reply #1 on: July 17, 2012, 08:34:17 AM »
Hello,

The navigation menu on your site is created with JavaScript (menu.js); that's why its links are not visible to bots.
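
To illustrate the difference (file names here are hypothetical, apart from menu.js): a menu injected by a script leaves no links in the HTML a bot fetches, while a plain HTML list of anchors is crawlable.

```
<!-- Script menu: the fetched HTML contains no <a> tags for a bot to follow -->
<script type="text/javascript" src="menu.js"></script>

<!-- Crawlable alternative: plain HTML links, which CSS can still style -->
<ul class="menu">
  <li><a href="index.htm">Home</a></li>
  <li><a href="filters.htm">Filters</a></li>
  <li><a href="contact.htm">Contact</a></li>
</ul>
```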
Re: Please help! The bots are not seeing my site!
« Reply #2 on: July 17, 2012, 10:46:47 PM »
If that's the case (and I don't doubt your expertise), then why did it work fine when I created .xml sitemaps here in the past, with the exact same menu?

Someone told me to create the menu with HTML instead, but I'm not sure how to do that, as this is a template that uses JavaScript and CSS and I'm not sure how to change it. Is there a way around this? I do have a sitemap.html page, but apparently that isn't good enough. Now that I've created my own xml sitemap and submitted it to Google, I see that G is indexing 9 of my 20 pages, but I'm pretty sure that in the past, with an xml sitemap created by this site, all pages were found and more than 9 were indexed.

Also, why on earth is xml-sitemaps only finding 2 pdf's which are not part of the javascript menu at all? If it were to find something, why that??
« Last Edit: July 17, 2012, 10:50:32 PM by seosoldier »
Re: Please help! The bots are not seeing my site!
« Reply #3 on: July 18, 2012, 09:07:14 AM »
Hello,

It looks like you can use [ External links are visible to logged in users only ] as the Starting URL for the generator (that page has plain HTML links to the other pages, as far as I can see).
Re: Please help! The bots are not seeing my site!
« Reply #4 on: July 19, 2012, 12:49:11 AM »
Thank you for that tip re using my sitemap.html to create my site map using your service! It did indeed create a sitemap of 41 pages which is more accurate.

Am I correct in assuming that a sitemap.xml should in fact have ALL pages of one's site, or would it be better that it not include pages that are not important, pages that are just kind of supplementary information?

And should I create and upload a new site map to my site AND submit it to google any time I add or subtract a page or change a page's title etc.?

Also I guess it doesn't really matter but I am really curious about the answer to  the question I asked before:
Why is it that the xml-sitemaps.com service for creating a sitemap worked perfectly before for my site, entering the home page, whereas now doing that results in the software seeing only 2 PDFs? Is it that your software used to read JavaScript menus and now it doesn't, or something else? I am curious because it seems that Google also stopped seeing my pages, whereas before it did... so I'm wondering what happened.

I am planning to move my site to Wordpress in the future but for now I have to make the best of it as is.
« Last Edit: July 19, 2012, 12:57:56 AM by seosoldier »
Re: Please help! The bots are not seeing my site!
« Reply #5 on: July 19, 2012, 08:48:55 AM »
> Am I correct in assuming that a sitemap.xml should in fact have ALL pages of one's site, or would it be better that it not include pages that are not important, pages that are just kind of supplementary information?

It's recommended to include all pages.

> And should I create and upload a new site map to my site AND submit it to google any time I add or subtract a page or change a page's title etc.?

You need to recreate the xml sitemap only if pages are added or removed.
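
One way to check whether a regeneration is due is to compare the <loc> entries in the existing sitemap against the current page list. A small Python sketch (URLs are placeholders):

```python
# Sketch: a sitemap needs regenerating when its <loc> set no longer
# matches the current set of pages. URLs below are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text for loc in root.iter(SITEMAP_NS + "loc")}

def needs_regenerating(xml_text, current_pages):
    return sitemap_urls(xml_text) != set(current_pages)

old = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
</urlset>"""

print(needs_regenerating(old, ["http://example.com/",
                               "http://example.com/new.htm"]))  # True: a page was added
```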

> Why is it that the xml-sitemaps.com service for creating a site map for my site worked perfectly before, for my site, entering the home page, whereas now doing that results it the software seeing only 2 pdf's?

It did not read JS menus before either; I'd need to check how the site was formatted previously to see why it worked.
Re: Please help! The bots are not seeing my site!
« Reply #6 on: July 19, 2012, 10:20:14 PM »
Thanks much for the info.

As to why it worked before, I guess it doesn't matter, but it freaked me out because suddenly it was not working any more. Whatever the change was, I think Google was affected the same way, because (after Penguin) suddenly only 2 of my pages were ranking in the Top 10 on Google, and those were, curiously, the same 2 PDF pages that xml-sitemaps now picks up from my index page.

So it's either a strange coincidence, or whatever affected your software also affected Google's bots (even though I had a complete xml sitemap on my site at that time, made by you guys' service).

Anyway, at least now I have a way to create a sitemap, and meanwhile I'm looking into how to change my menu from JS to HTML, so maybe that will help Google like my site better.