Google is not indexing my entire site despite having a sitemap
I have an e-commerce website beyondtime.in. I have been constantly monitoring Googlebot crawling on my website and my webmaster account. Lately, I have found two issues that I have not been able to understand.
1.) Googlebot has only been crawling beyondtime.in/telecom.php, even though that URL is not valid. What needs to be done to get Google to crawl the other pages of the website as well?
2.) The second question is about the Google Webmaster account, where I've submitted my sitemap with 227 URLs. Of those, only 156 have been indexed. None of the images on my website have been indexed by Google.
Your images are probably not indexed because your robots.txt file blocks bots from crawling the /images directory.
At first glance I would also add a User-agent: * line above the Allow: / line.
Also add the line
Sitemap: http://www.beyondtime.in/sitemap.xml
to your robots.txt file so that other search engines can discover your sitemap (the Sitemap directive must be an absolute URL pointing at the sitemap file itself, not just the domain).
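Putting those suggestions together, a minimal robots.txt might look like the sketch below (assuming the sitemap lives at /sitemap.xml and that you do want the /images directory crawled, so its Disallow line is gone):

```
User-agent: *
Allow: /

Sitemap: http://www.beyondtime.in/sitemap.xml
```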
Double-check in Webmaster Tools (under Status > Blocked URLs) that nothing is blocked.
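You can also verify programmatically which paths a given robots.txt blocks, using Python's standard urllib.robotparser. The rules below are illustrative stand-ins, not your live file; fetch http://www.beyondtime.in/robots.txt to test the real rules:

```python
import urllib.robotparser

# Illustrative robots.txt content (assumption: your file disallows /images).
robots_txt = """\
User-agent: *
Disallow: /images
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# With a Disallow on /images, image URLs are blocked while normal pages are not.
blocked = not rp.can_fetch("Googlebot", "http://www.beyondtime.in/images/logo.png")
allowed = rp.can_fetch("Googlebot", "http://www.beyondtime.in/index.php")
print(blocked, allowed)  # prints: True True
```

If `blocked` comes back True for your image paths, that confirms the robots.txt block is what keeps images out of the index.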
You have a low (a very low) number of backlinks so far (as verified with Open Site Explorer). That, in combination with low traffic, can lead to slow indexing of all your pages. A quick check with an SEO spider tool shows good-looking status codes.
One BIG problem I see is that almost every page has a rel=canonical tag pointing to the homepage. That reads as "almost every page is a duplicate of the homepage, so ignore it (the page carrying the canonical reference)."
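For reference, the canonical tag on a unique page should normally point at that page's own preferred URL, not the homepage. A sketch, using a hypothetical product URL invented for illustration:

```html
<!-- In the <head> of http://www.beyondtime.in/product.php?id=42 -->
<!-- The canonical points at the page itself, not the homepage -->
<link rel="canonical" href="http://www.beyondtime.in/product.php?id=42" />
```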
(Once all of this is fixed, I would double-check that sitemap.xml lists all pages and that the pages allow indexing and following.)
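As a sanity check on that last point, you can count the URLs your sitemap actually lists and compare the number against the 227 you submitted. A minimal Python sketch with the standard library; the inline XML and its page paths are placeholders for the real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Placeholder sitemap content; in practice, fetch
# http://www.beyondtime.in/sitemap.xml and parse that instead.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.beyondtime.in/</loc></url>
  <url><loc>http://www.beyondtime.in/product.php?id=42</loc></url>
</urlset>"""

# Sitemap files use the sitemaps.org namespace, so findall needs a prefix map.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs listed")  # prints: 2 URLs listed
```

If the count does not match what Webmaster Tools reports as submitted, the sitemap on the server differs from the one you uploaded.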