Turnbaugh106

First, Googlebot can detect whether pages have changed and skip crawling those that have not (this should not reduce the number of indexed pages). It can detect this in several ways, for example using timestamps or ETags.
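As a rough illustration (not Googlebot's actual implementation), a crawler can compare the ETag and Last-Modified headers from the previous crawl against the current ones and skip pages that have not changed. The function below is a hypothetical sketch of that decision:

```python
from email.utils import parsedate_to_datetime

def should_recrawl(cached, response_headers):
    """Decide whether a page needs re-crawling.

    `cached` holds the ETag and Last-Modified values recorded on the
    previous crawl; `response_headers` holds the values just received.
    This mimics the timestamp/ETag checks described above, but the
    exact logic is illustrative only.
    """
    # An unchanged ETag means the server considers the content
    # identical: no need to re-crawl.
    if cached.get("etag") and cached["etag"] == response_headers.get("ETag"):
        return False
    # Otherwise fall back to comparing Last-Modified timestamps
    # when both sides provide one.
    old = cached.get("last_modified")
    new = response_headers.get("Last-Modified")
    if old and new:
        return parsedate_to_datetime(new) > parsedate_to_datetime(old)
    # Without either signal, re-crawl to be safe.
    return True
```

In practice a crawler would send these stored values back as `If-None-Match` / `If-Modified-Since` request headers and let the server answer `304 Not Modified`, which avoids transferring the page body at all.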

After crawling, there is no guarantee that a page will be indexed; it may or may not be. Quite a few factors determine whether Google indexes the pages the bot has crawled.

From time to time Google will also purge its index and remove pages it thinks should not be there. Signals like content duplication, junk/spam content, and other factors are used to decide this. No one (except Google) knows in advance when these kinds of purges happen or exactly what factors are used, but they do happen from time to time.

Also, read the content under the heading "Indexing Stuff" at the following link for more information: www.google.com/support/forum/p/Webmasters/thread?tid=2ad71287c04eb280
