Why does Google only index a part of my site?

@Yeniel560

Posted in: #GoogleSearchConsole #Sitemap

For a while now I have been having issues with Google indexing my site. For some reason it wouldn't take more than 33 pages from my XML sitemap. (Screenshot omitted: images not allowed for new users...)

This has been the case for the past couple of months. Why doesn't Google index my entire site? It's only 37 pages!





2 Comments


 

@Turnbaugh106

I appreciate this is an old question, but there is a consideration that is easily overlooked.

If you are dynamically creating your sitemap (say, with PHP), a site can get stuck at one URL indexed from the sitemap if you don't format the URLs correctly.

echo '<url><loc>' . $SITE . '/asection/acategory/somepage.html</loc><lastmod>2018-01-14</lastmod><priority>0.7</priority></url>';


Simple enough, right?

Now let's say our variable is:

$SITE = 'https://www.icalculator.info/';


All good, right? Wrong. When the PHP is parsed, the result is a URL like this:
https://www.icalculator.info//asection/acategory/somepage.html

Note the double // after the domain. This is one of those very silly but incredibly easy mistakes to make. I do occasional SEO reviews for friends and the odd contract, and I have seen that mistake many times. Anyway, I'm sharing it now, so if you have the dreaded single page from your sitemap showing in Google Webmaster Tools, you can check for this.
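The pitfall above boils down to joining a base URL that ends in a slash with a path that starts with one. A minimal sketch of a safe join, in Python rather than the PHP used above (the function name `join_url` is my own, not from the original post):

```python
def join_url(base: str, path: str) -> str:
    """Join a base URL and a path without producing a double slash,
    regardless of whether the base ends with / or the path starts with /."""
    return base.rstrip("/") + "/" + path.lstrip("/")

# Reproducing the example from the answer above:
print(join_url("https://www.icalculator.info/", "/asection/acategory/somepage.html"))
# → https://www.icalculator.info/asection/acategory/somepage.html
```

In PHP the equivalent fix would be to wrap the variable in `rtrim($SITE, '/')` before concatenating the path.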



 

@Sent6035632

Google doesn't always index everything on a website. It is only interested in relevant content, and for some reason it considers 4 pages on your site to be content it's not interested in.

You can create a Webmaster Tools account for your website. In this account you can see whether crawl errors are occurring on these pages, or whether something in your robots.txt is telling Google to ignore them. There is even a diagnostic tool that lets you fetch the page as Google would see it and, if that succeeds, manually submit the page to Google for indexing (once again, dependent on Google considering it content worth indexing).
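The robots.txt check mentioned above can also be done locally. A small sketch using Python's standard-library `urllib.robotparser`, with a hypothetical rule and example.com URLs standing in for a real site:

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt that blocks one section of the site.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Googlebot falls back to the "*" rules here, so /private/ pages are blocked:
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # → False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # → True
```

If a page missing from the index comes back `False` here, the robots.txt rule is the likely cause rather than anything in the sitemap.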


