
How long does it take for Google to remove a page from its index? Or, why aren't these pages being excluded?

@Tiffany637

Posted in: #Googlebot #GoogleSearch #GoogleSearchConsole #RobotsTxt

A set of pages is both disallowed in robots.txt and served with the header X-Robots-Tag: noindex, nofollow. When checking with Google Webmaster Tools, the pages are reported as "Denied by robots.txt", which is nice. Also, as mentioned in this answer, disallowed pages may still be indexed even if never technically crawled, because that's how Google rolls.
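For reference, the setup looks roughly like this (the /profile/ path is inferred from the example URL below; the exact rules weren't posted):

    # robots.txt -- blocks crawling of the profile pages
    User-agent: *
    Disallow: /profile/

    # HTTP response header sent with each profile page
    X-Robots-Tag: noindex, nofollow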

However, two weeks after adding the X-Robots-Tag header, the pages still appear in Google search results.

For example, this test page, www.english-attack.com/profile/scott-s-sober, is found when searching for its h1 title "Scott S. Sober": www.google.com/search?q=%22Scott+S.+Sober%22

Why is this?





2 Comments


@Heady270

Disallowing a page in robots.txt will not prevent Google from indexing it. It will only prevent Googlebot from re-crawling the page. If the page has been previously crawled, Google may find the version of the content it knows about compelling enough to keep in its index for months.

If enough external links point to a page, Google occasionally keeps it indexed forever even though it is blocked by robots.txt. Sometimes it even indexes pages it has never crawled at all. In those cases it uses only the anchor text of inbound links as the keywords for the page, and it has no cached version of the page.

If you want Google to remove the page from its index, you should allow Googlebot to crawl the page and see the noindex directive (in your case the X-Robots-Tag header; a meta robots tag works the same way). If you don't allow Googlebot to crawl the page, it will never learn that you don't want it indexed. So take the Disallow: rule for those URLs out of robots.txt.
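For example, assuming the pages live under /profile/ on an Apache server with mod_headers enabled (neither detail is stated in the question; adapt to your stack), the end state would look roughly like this:

    # robots.txt -- no Disallow for the profile pages any more,
    # so Googlebot may crawl them (an empty Disallow allows everything)
    User-agent: *
    Disallow:

    # .htaccess for the /profile/ directory -- keep serving the
    # noindex header (Apache mod_headers syntax)
    Header set X-Robots-Tag "noindex, nofollow"

Once Googlebot re-crawls a page and sees that header, the page becomes eligible for removal from the index.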

Alternatively, you could use Google Webmaster Tools to request removal of each URL. That can be painful if you have more than a handful of URLs that you want de-indexed.



 

@Hamm4606531

The cause of the problem is that Google never sees the newly added X-Robots-Tag header, because the robots.txt block stops it from re-crawling the page.

Removing the block from robots.txt and letting Google fetch the pages, headers included, did remove the pages from the results.
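A quick way to sanity-check this is to fetch a page yourself and confirm the header is actually being served. A minimal sketch in Python, standard library only (the URL is the test page from the question):

    # check_noindex.py -- print the X-Robots-Tag header a URL responds with
    import urllib.request

    url = "http://www.english-attack.com/profile/scott-s-sober"
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        # None means the server isn't sending the header at all;
        # some servers handle HEAD oddly, so fall back to a GET if needed
        print(resp.headers.get("X-Robots-Tag"))

If that prints "noindex, nofollow" and robots.txt no longer blocks the URL, it is just a matter of waiting for Googlebot to re-crawl.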


