How long does it take to re-index a website via the noarchive tag?

@Harper822

Posted in: #Cache #Google #GoogleIndex #Indexing #Noarchive

I read elsewhere on here that one can't re-index their site with Google, but I'd like to challenge that.

Currently, every page on my website (which contains about 1,000,000 pages) carries a noarchive robots meta tag so that Google stops caching the old page content.
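For reference, a noarchive robots meta tag is normally written like this in each page's <head>:

    <meta name="robots" content="noarchive">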

Once all the old content is no longer cached, I will remove the noarchive robots tag so that Google can cache the NEW, improved pages.

I'm just curious how long this process usually takes. In Webmaster Tools, I set the crawl rate to the maximum setting, which allows Google to crawl up to two pages per second.

At the moment, about 175,000 of the 1,000,000 pages are indexed, and it would be great if I could get Google's caches updated with the new pages.

Can someone give me an idea of how long the process takes?


1 Comment


@LarsenBagley505

First, there is no need to use the noarchive robots tag in your situation. You state that it is there to remove all of your old content from the Google cache, and that you will then remove it so the cache saves your new content. It doesn't work that way: each time Google detects a new version of a page, the cached copy stored with Google is updated to reflect the new version.

As for how long this takes, it very much depends. Google can take a short time or a long time to re-index your site, and more often than not it won't re-index your whole site at once; instead, it will re-index different parts at different rates, depending on what its re-indexing algorithms deem necessary.

On top of setting the crawl rate in Webmaster Tools to a high level, the only other option is to list all of the pages in a sitemap.xml file; then, each time you update a page, you update the lastmod declaration in that page's sitemap URL record. Through the sitemap you can also declare how frequently each page is expected to change, which Google uses when evaluating how often to re-index your site. Once the changes have been made to your sitemap.xml file, you resubmit it in Google Webmaster Tools; Google will then process the sitemap and add the necessary pages to the crawl queue. This doesn't necessarily speed up the crawl itself per se, but it is faster than waiting for natural linking to trigger a re-index of a page.
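As a minimal sketch, a single URL record in a sitemap.xml file with the lastmod and changefreq fields mentioned above might look like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- placeholder URL and last-modified date -->
        <loc>http://www.example.com/page.html</loc>
        <lastmod>2014-06-15</lastmod>
        <!-- valid values: always, hourly, daily, weekly, monthly, yearly, never -->
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

Both fields are hints rather than commands; Google still decides its own recrawl schedule, but they give its scheduler something concrete to work from.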
