Duplicate page content and the Google index
I have static pages with dynamically expanding content that Google is indexing. I also have deep links into virtually duplicate pages which pre-expand the relevant section of content.
It seems like Google is ignoring all my specialized pages and not putting them in the index, even after going through Webmaster Tools, crawling them, and submitting them to the index manually.
I also use the Google API to integrate search on the site, and the deep-linked pages won't show up. Is there a good solution for this?
1 Comment
Indexing rate is a function of how often the content changes, how valuable the content is in the eyes of search engines, and how many valuable backlinks there are to your site. If some of your content is duplicate or near-duplicate, then it is unlikely to be indexed, no matter what. You need to use a canonical link element (rel="canonical") wherever possible for such content.
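As an illustration (the URLs below are hypothetical, not from the question), a deep-linked variant that pre-expands a section can declare the base page as its canonical version:

```html
<!-- Served on the pre-expanded variant, e.g. https://example.com/guide?expand=section-2 -->
<head>
  <!-- Tells search engines that the base page is the preferred URL
       and that ranking signals should be consolidated there -->
  <link rel="canonical" href="https://example.com/guide" />
</head>
```

Note the trade-off: with canonicalization, Google consolidates signals onto the base URL, but the near-duplicate variants themselves will rarely appear in search results, which is consistent with the behavior you are seeing.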
Beyond that, get more backlinks and traffic to your site if possible. I am also assuming your site is not suffering from one of the typical issues that prevent it from being indexed.