Allow crawl bots to only see articles and not the index page?
I'm currently trying to improve my search engine results. I have an index page that lists all articles with a short teaser, and then the article pages themselves.
Now when I google, for example, "websiteXXX rails development", it shows me the index page and highlights the one article that has these words in the teaser. But I would rather have the search engine show the article itself and link directly to it.
What is the better thing to do now?
- Disallow the search engine from indexing the index page (so only the article pages are indexed)
- Give the index page a priority of 0.5 and the article pages 1.0
- Something else?
Thanks!
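For reference, this is roughly what the two options mentioned in the question would look like. The URLs and the article slug are placeholders, and this is only a sketch of the mechanics, not a recommendation (see the answers below).

    <!-- On the index page only: keep it crawlable but out of search results -->
    <meta name="robots" content="noindex, follow">

    <!-- In sitemap.xml: hint at relative importance (many search engines ignore priority) -->
    <url>
      <loc>https://www.example.com/articles</loc>
      <priority>0.5</priority>
    </url>
    <url>
      <loc>https://www.example.com/articles/some-article</loc>
      <priority>1.0</priority>
    </url>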
2 Comments
You are going about this the wrong way. You want the bots to crawl your index page(s). This helps them find the rest of the links on your site.
For best results, segment your post-feed landing pages by subject and fill them with closely related articles. This is good for both SEO and user experience.
Lastly, search engines should be able to index both your articles and your landing pages. If they only index the landing pages, you might have a bigger problem: poor-quality content and some technical SEO issues.
When an end user lands on your index page, they will quickly scan for the content they originally searched for. If they cannot find it, they will leave, so your index page probably has a high bounce rate.
I would not suggest excluding your index page or changing its importance. Instead, consider programming the site to consume the referring page's search string and have your index automatically search your site for the relevant (searched) content, as sketched below. If you cannot get this to work, then change the priority.
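A minimal client-side sketch of that idea, assuming the index page has a search box with the id "site-search" and an existing filterArticles() function (both hypothetical names). Note that most major search engines no longer pass the query string in the referrer, so this only helps when a "q" parameter actually survives:

    // Hypothetical site search hook; replace with your own filtering logic.
    declare function filterArticles(query: string): void;

    function prefillSearchFromReferrer(): void {
      if (!document.referrer) return;
      let query: string | null = null;
      try {
        // e.g. https://www.some-search-engine.com/search?q=websiteXXX+rails+development
        query = new URL(document.referrer).searchParams.get("q");
      } catch {
        return; // referrer was not a parsable URL
      }
      if (!query) return;
      const box = document.getElementById("site-search") as HTMLInputElement | null;
      if (box) {
        box.value = query;     // show the visitor what was searched for
        filterArticles(query); // narrow the index to the relevant articles
      }
    }

    document.addEventListener("DOMContentLoaded", prefillSearchFromReferrer);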