Why is Google Webmaster Tools showing fewer indexed pages than a site: search?
When I check my website's indexed pages in Google Webmaster Tools, it shows that only 252 pages are indexed, but when I search on Google by typing site:http://www.example.com, it shows more than 900 pages indexed.
What is causing this? I have also submitted an XML sitemap in Webmaster Tools, with 900+ pages submitted, so why is the indexed count showing so low?
You can't trust the number of results you get when you do a site: search. The numbers from Google Webmaster Tools will be much more accurate.
First, the number of results returned by Google for any search (not just site: searches) is often wildly inaccurate. The linked article goes into a lot of detail about the reasons for this, and includes the following quote from Google's Matt Cutts:
We try to be very clear that our results estimates are just that–estimates. In theory we could spend cycles on that aspect of our system, but in practice we have a lot of other things to work on, and more accurate results estimates is lower on the list than lots of other things.
For site: searches, Google makes the results somewhat less informative on purpose. This is done to prevent people from divining too much knowledge about Google and the websites it indexes by doing specially crafted searches. In this case, Google has decided to show the number of indexed pages on your site only to you, the webmaster, through their authenticated webmaster console. They show an inaccurate number to the public on purpose. They believe the number of indexed documents on your site is for you and Google to know, but not for the general public.
This has to do with a combination of things. Your sitemap doesn't really tell Google what to index. It's more of a guide for the spiders than anything else. In fact, even if you didn't have a sitemap but had submitted your website to Google, it would still be crawled.
The reason a site:domain.com search shows you so many results is that, to a search engine spider, anything and everything that can be opened with a URL is a page. This is most common with WordPress sites. For example, on a WooCommerce WordPress site, technically every single product variation can be opened on its own page. It would look like this:
example.com/category1/products/product1/variation1
So if you had 40 products divided into 5 categories, and each product comes in 3 color or size variations, you would have 125 pages that are not really part of your site structure or even in your sitemap. The 120 variation pages may be of good use to you, but the 5 category pages are useless.
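To see where that 125 comes from, here is the arithmetic as a quick Python sketch (the numbers are just the ones from the example above):

categories = 5
products = 40                             # 8 products per category
variations = 3                            # color or size variations per product
variation_pages = products * variations   # 120 product variation pages
category_pages = categories               # 5 category pages
print(variation_pages + category_pages)   # 125 crawlable URLs in total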
Do some work in your robots.txt file: take note of what your site:domain.com results show, and disallow all 404s and all pages that provide no SEO value to you (e.g. terms and conditions, privacy policy, category pages, tag pages, a picture that opens on its own page when clicked, etc.).
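As a rough sketch, such a robots.txt might look like the following (the paths are hypothetical; substitute the ones from your own site: results). Note that robots.txt blocks crawling; it does not by itself remove pages that are already indexed.

User-agent: *
Disallow: /terms-and-conditions/
Disallow: /privacy-policy/
Disallow: /category/
Disallow: /tag/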
Do the URLs that are indexed exactly match the URLs in your sitemap? If so, they should be counted among the ones 'indexed'.
If you have a mismatch of hundreds, compare the indexed URLs to your sitemap URLs. It could be that Google is indexing different URLs than the ones you submitted, perhaps the wrong version of the website (e.g. www vs. non-www) or irrelevant URLs you didn't want indexed, like search or calendar pages.
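If you want to check this mechanically, here is a minimal Python sketch, assuming you have a local copy of your sitemap.xml and have exported the indexed URLs (e.g. copied from the site: results) into a plain-text file, one URL per line (both file names are hypothetical):

import xml.etree.ElementTree as ET

# Namespace used by standard XML sitemaps
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(path):
    # Return the set of <loc> URLs from a local sitemap file
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.getroot().findall(".//sm:loc", NS)}

def listed_urls(path):
    # Return the set of URLs from a text file, one URL per line
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

submitted = sitemap_urls("sitemap.xml")       # hypothetical local copy of your sitemap
indexed = listed_urls("indexed_urls.txt")     # hypothetical export of the site: results

print("Indexed but not in the sitemap:")
for url in sorted(indexed - submitted):
    print(" ", url)

print("In the sitemap but not indexed:")
for url in sorted(submitted - indexed):
    print(" ", url)

The two set differences point you at the two possible problems: URLs Google found that you never submitted (site structure leakage) and submitted URLs Google has not indexed.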