1 page deep
« on: February 13, 2007, 02:15:59 AM »
Hi, I looked through the forums and found several single-page issues - unfortunately, none of them resolves my issue.
The site is using a Suckerfish list menu as well as straight text links.  I am not using a robots.txt file, but I am using <meta> tags.  Would you be so kind as to have a look and let me know?
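By <meta> tags I mean the standard robots directives in the page head, roughly along these lines (a generic illustration, not necessarily the exact values on my pages):

<meta name="robots" content="index, follow">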
[ External links are visible to forum administrators only ]
Thank you,
D.
Re: 1 page deep
« Reply #1 on: February 13, 2007, 11:43:48 PM »
Hello,

There is something on your site that doesn't display the page to bots. Here is how your homepage is seen by Google:
https://www.xml-sitemaps.com/se-bot-simulator.html?op=se-bot-simulator&go=1&pageurl=http%3A%2F%2Fnwtreespecialists.com%2F&se=googlebot&submit=Start

You can check this in the Google cache: [ External links are visible to logged in users only ]
Re: 1 page deep
« Reply #2 on: February 15, 2007, 01:33:50 AM »
Regarding the simulation: all of the page content is present, including the links.  The site is ColdFusion - I have CF sites more than 500 pages deep with which I have had no issues.  There are three sets of links on each page: a list-menu set (Suckerfish) at the top, a plain-text set contained within the body content, and a set pulled in through a custom-tag include (which has also never given me trouble before).  In fact, all of the text content from the custom tag appears in the simulation output, and the simulation shows all three sets of links mentioned above, so whether static or included, the links are present in the simulation.
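For clarity, the Suckerfish set is just the usual nested-list markup with plain anchor links, roughly like this (a generic sketch, not my exact code - the URLs here are made up):

<ul id="nav">
  <li><a href="/index.cfm">Home</a>
    <ul>
      <li><a href="/page1.cfm">Page 1</a></li>
      <li><a href="/page2.cfm">Page 2</a></li>
    </ul>
  </li>
</ul>

Since Suckerfish menus are ordinary HTML lists styled with CSS, a crawler should see those as plain links.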

Below the page-layout section, all of the site's links, both internal and external, are identified.

The Google cache is showing the previous site.  The new site propagated on Monday the 12th, so Google has not even crawled it yet.

OK, I just ran the tool again, and success!  Call me crazy, but maybe I tried too early and the tool was still reaching the old domain's server, which hosted a .swf site and would only be recognized as one page.
-Issue Resolved- Lesson learned: wait one additional day after propagation before running the sitemap generator tool.
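For anyone who hits the same thing: a quicker check than waiting an extra day is to confirm the new A record is already resolving before running the generator, for example from a command line (assuming dig or nslookup is installed):

dig nwtreespecialists.com A +short
nslookup nwtreespecialists.com

If those still return the old server's address from where you sit, there is a good chance the generator's server is still seeing the old site as well.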

Thank you for looking at that!
D.
Re: 1 page deep
« Reply #3 on: February 15, 2007, 02:38:43 PM »
Quote
The Google cache is showing the previous site.  The new site propagated on Monday the 12th, so Google has not even crawled it yet.

OK, I just ran the tool again, and success!  Call me crazy, but maybe I tried too early and the tool was still reaching the old domain's server, which hosted a .swf site and would only be recognized as one page.

I see. Most likely it means that your domain hadn't yet propagated to our server (where the script runs) at the moment you tried it the first time.
I'm glad you got it working! :)