
How to de-index pages from Google using robots.txt

@Voss4911412

Posted in: #Seo

I added the noindex tag for Googlebot, but it is taking a long time to de-index some pages since I have over 100k pages I need to de-index.

What is the proper way to de-index pages using robots.txt?

Thanks!

Any idea how long it should take?





2 Comments


 

@Jamie184

Also check "URL removal explained, Part I: URLs & directories".

As that article suggests, you can remove directories (and whole sites) in Google's Webmaster Tools. This might be quicker than robots.txt initially, although you will need to use robots.txt as well.

Keep the noindex meta tag to prevent your pages from appearing at all in Google's index. A robots.txt-only solution could still allow URL-only listings to appear in the SERPs.
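For reference, the noindex directive mentioned above goes in the head of each page you want removed; it can also be sent as an HTTP response header, which is useful for non-HTML files. A minimal sketch:

```html
<!-- In the <head> of each page that should be dropped from the index -->
<meta name="robots" content="noindex">
```

The equivalent HTTP header form is `X-Robots-Tag: noindex`. Note that the crawler must be able to fetch the page to see either directive, so the tag has to be in place before any crawl-blocking rules take effect.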

Using robots.txt to remove the "http://www.example.com/getridofthis/" directory...

User-agent: *
Disallow: /getridofthis/
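If you want to sanity-check that the rule above actually blocks crawlers from the directory before waiting on Google to recrawl, Python's standard-library robots.txt parser can evaluate the rules locally. A small sketch (the URLs are just the example domain from above):

```python
# Parse the robots.txt rules locally and check which URLs they block.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /getridofthis/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked directory: crawlers may not fetch anything under /getridofthis/
print(parser.can_fetch("Googlebot", "http://www.example.com/getridofthis/page.html"))  # False

# Everything else is still crawlable
print(parser.can_fetch("Googlebot", "http://www.example.com/keepthis/page.html"))  # True
```

This only confirms the syntax does what you intend; actual de-indexing still depends on Google recrawling and processing the pages.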



 

@Odierno851

Take a look at robotstxt.org; it has all the info you need.


