How to make Google index all sitemap.xml records?

@Gonzalez347

Posted in: #CrawlRate #Googlebot #GoogleSearchConsole #Indexing #XmlSitemap

I have a sitemap.xml index file with 3 XML sitemap files in it, covering around 120 pages for indexing.
A year has passed and still only about 1/3 of them are indexed. All of the most important pages are in the 1st sitemap file, but Google indexes a similar number of pages from each of the sitemap files instead...
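For reference, a sitemap index of that shape follows the sitemaps.org protocol; the domain and file names below are placeholders, not my actual URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- the most important pages are all listed in this first file -->
  <sitemap>
    <loc>https://example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap3.xml</loc>
  </sitemap>
</sitemapindex>
```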

How to make Google index all sitemap.xml records?

I have crawl rate set to MAX in GWT.


3 Comments


 

@Jamie184

I am not sure I have this right, so here is what I understand.

You have a sitemap index file listing three sitemap files with about 120 pages each, and these sitemap files are about a year old. I am going to work from that premise.

Given that, this is what I know. If you have only a few hundred pages, just use one sitemap file. If your site changes, you should update your sitemap to reflect those changes wherever possible, including priorities and update frequency. Google is not too concerned with sitemaps for smaller sites that can be spidered directly, which may be why it has not read all of your sitemap files. It may not need to.
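The priorities and update frequency I mentioned map to the `<priority>` and `<changefreq>` elements of the sitemaps.org protocol; a minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <!-- keep lastmod current whenever the page actually changes -->
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <!-- priority is relative importance within your own site, 0.0 to 1.0 -->
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/archive/old-page</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Keep in mind these are hints, not commands; search engines may discount them if they do not match what the crawler actually observes on your pages.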

Content freshness and sitemap freshness are important. The sitemap should also add value for the spider, such as listing pages that are not directly linked, queuing pages on larger sites, signalling content freshness, etc. Otherwise the sitemap will be all but ignored.

The best way to get Google to spider your site is to have fresh content. I assume that is why you created the sitemap. If you do not need the sitemap, I would remove it; if you can make it add value as mentioned above, I would keep it. Either way, content freshness is the key to getting Google excited about spidering your site, and your sitemap should be fresh too.

Best of Luck!



 

@Martha676

Note:


I have crawl rate set to MAX in GWT.


By setting the crawl rate you do not actually force Google to crawl at the maximum rate; you are limiting the crawl rate, by setting an upper bound on how many requests Googlebot may perform each second.

If you don't have problems with server load, you should let Googlebot determine the crawl rate itself, which is calculated primarily from your PageRank and how many requests Googlebot can perform on your web server each second.

There is no one-click way to force Googlebot to crawl and index more pages. If Googlebot does not crawl your website enough and is not indexing your pages, you should consider checking your website for errors and duplicate content, and also improving your internal and external linking.



 

@Kevin317

You can't. XML sitemaps are how you tell search engines which pages you would like to have indexed, but they do not guarantee that search engines will crawl or index those pages. In fact, there is no way to force search engines to index anything. They have their own criteria for what gets indexed and when, and we cannot change that.


