Mobile app version of vmapp.org
Cooney921: How to get Google to index site search result pages?

@Cooney921

Posted in: #SearchEngines #SearchResults #Seo #SiteSearch

I'm looking for a white-hat method of getting Google to show a local site search result as a Google search result.

For clarification:

There is a page filled with the names of certain people. Each name is linked to the local site search engine, so clicking on David Jones goes to mysite.com/?q=david+jones. I want Google to show the aforementioned link mysite.com/?q=david+jones as a search result when something like mysite david jones is queried.

There is an obstacle I need to avoid:


There are more than 450 people's names (or links) on the
aforementioned page. I've heard that having more than, say, 150 links
on a page is bad for SEO. In addition to those names there are
other links to various other pages, e.g. the main menu, footer links,
latest article links, etc. (it's a Joomla system).


What I want to try:

My solution to this is to use a robots meta tag so that the page content is indexed but its links are not followed. But I'm still stumped as to how to show the site search result as a Google search result.
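A minimal sketch of what I mean by that robots tag, assuming it can be placed in the page head (in Joomla this is typically set per-article or in the template; the exact placement here is an assumption):

```html
<!-- In the <head> of the names page: index the page's text,
     but do not follow any of the 450+ links on it -->
<meta name="robots" content="index, nofollow">
```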

Will adding all these local site search engine links in sitemap help?


1 Comment


@Gretchen104

You do not want your local site search results indexed by Google. First of all, as John Conde stated, Google doesn't necessarily want site search results in its index, and frankly you don't want to present a huge page of links to Google: it will look spammy to the algorithm under nearly all circumstances.

As for the authoritative source that John was unable to locate, I think there are several that qualify. The first is an old Matt Cutts post from 2007 that mostly describes the problem but also quotes Vanessa Fox responding to a question on Webmaster Help:


Typically, web search results don’t add value to users, and since our
core goal is to provide the best search results possible, we generally
exclude search results from our web search index. (Not all URLs that
contain things like “/results” or “/search” are search results, of
course.)


Cutts then points to the quality guidelines on the official Webmaster Guidelines support page, which was modified to include the following bullet:


Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

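Applied to the asker's setup, that guideline translates into a short robots.txt rule; a minimal sketch, assuming all of the site search URLs use the ?q= query parameter as in mysite.com/?q=david+jones:

```
# robots.txt at the site root
User-agent: *
# Block crawling of internal search result URLs
Disallow: /*?q=
```

Note that the * wildcard in Disallow is honored by Googlebot, though it is not part of the original robots.txt specification, so other crawlers may ignore it.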

Cutts further states in the 2007 post:


it’s still good to clarify that Google does reserve the right to take action to reduce search results (and proxied copies of websites) in our own search results.


So by now it should be abundantly clear that this is a practice to avoid and has been for quite some time (more than 6 years at the time of this answer).

In case you need more proof, there is a post on Search Engine Land from September 2013 featuring a video of Matt Cutts answering essentially the same question as in 2007; it also links to the Automatically Generated Content article in Webmaster Tools Help, which restates all of the above, only much more succinctly than I have done here.

tl;dr

Don't do this.

What you should be doing instead is making sure you have actual content pages for David Jones et al. and listing those in your sitemap so Google will index them. A local site search is really just another navigational tool for your users once they are on the site; it is not a destination for inbound traffic.
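As a sketch of that alternative, assuming each person gets a real content page such as mysite.com/people/david-jones (the URL scheme here is hypothetical), the sitemap lists the content pages rather than the search URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per real content page, not per search URL -->
  <url>
    <loc>http://mysite.com/people/david-jones</loc>
  </url>
  <!-- ...one <url> entry for each of the ~450 people... -->
</urlset>
```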
