
Create a way for Googlebot to find the deep pages in my site when users would find these pages via site search

@Margaret670

Posted in: #Googlebot #Seo

I have a website like Foursquare, where users search for businesses in a specific location. Users can click on markers inside the map to view details about the companies/stores, and these details open on a dedicated page for each business.

I don't want users to reach these detail pages from an index page with a bunch of links, one per business name, but I do need to make these pages visible to Googlebot. How can I do this? Can I create an index page visible to Google only?





3 Comments


 

@Rivera981

It appears that you're missing the point of how Google works. If you want those pages to be found and crawled by Google, then you should want them found and crawled so that they can appear in the search results.

The logic then follows that if they display/appear in Google's search results then they are accessible to users, who will click on the links and land on those pages.

Now, you could add a "noindex,follow" robots meta tag to such an index page, so that it is crawled and its links are followed, but the page itself doesn't get indexed. This may provide some internal link relevance that helps your site.
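As a sketch of that suggestion, the index page listing all the businesses could carry a robots meta tag like this (the intent here is an assumption: keep the index page itself out of the results while still letting its links be followed):

```html
<!-- In the <head> of the business index page:
     don't index this page, but do follow its links
     to the individual business detail pages. -->
<meta name="robots" content="noindex,follow">
```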

Any other method that provides content only for Google, and not for users, will quickly be acted on by Google, either algorithmically or manually, as it goes against their Terms & Conditions.



 

@Heady270

One way to do this would be with HTML sitemap pages. These pages would have a list of links to all the business pages on your site. While you can't actually hide these pages from users, you can make them less prominent. For example, you could link to them from the footer.

Another way to let Googlebot know about your deep content is through XML sitemaps. Your XML sitemap can be submitted to Google through Webmaster Tools and users will never know about it.
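A minimal XML sitemap for the business detail pages might look like this (the domain and paths are placeholders for your own URLs; the format follows the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per business detail page -->
  <url>
    <loc>https://example.com/business/acme-coffee</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/business/best-bagels</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Besides submitting it in Webmaster Tools, you can also advertise the sitemap's location with a `Sitemap:` line in your robots.txt file.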

The problem with using either HTML sitemaps or XML sitemaps is that they don't do a great job of passing PageRank to your deep pages and getting those pages ranking well. While your sitemaps may get the pages indexed, for good rankings, you need some way of passing link juice to those pages as well. One way of doing so would be to link your business pages to other business pages. Some sort of "related businesses" list of links on those pages would be effective.
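A "related businesses" block of that kind could be as simple as the following markup on each detail page (names and URLs are made up for illustration):

```html
<!-- On each business detail page: cross-links that pass
     internal link equity between the deep pages. -->
<aside>
  <h2>Related businesses nearby</h2>
  <ul>
    <li><a href="/business/acme-coffee">Acme Coffee</a></li>
    <li><a href="/business/best-bagels">Best Bagels</a></li>
    <li><a href="/business/corner-books">Corner Books</a></li>
  </ul>
</aside>
```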



 

@Kristi941

No, you can't do this. Showing one version of a page to Google and another to users is the very definition of black hat SEO (cloaking), and it is exactly what the search engines don't want websites to do. If you do this you can be penalized, up to and including having your site banned from Google. This is a very bad idea.


