Mobile app version of vmapp.org

Use robots.txt to prevent privacy policy, terms and conditions, and guarantee pages from being crawled and indexed by Google

@Samaraweera270

Posted in: #Html #MetaRobots #Php #RobotsTxt #UrlEncoding

I need to block pages such as privacy policy, terms and conditions, and guarantees from being crawled and indexed by Google. They might be unoriginal. I use them for my multiple websites and they have no content relevant to be ranked by search engines.

Their links are like example.com/privacy.php
I don't want to mess up and end up being penalized by Google





1 Comment


 

@Harper822

Go to cPanel >> public_html >> your site's directory, create a file named "robots.txt" there, and paste in the following:

User-agent: *
Disallow: /privacy.php
Disallow: /terms.php
Sitemap: https://www.xxxxxxx.xxx/sitemap.xml


Change the paths above to match your site. Note that the Disallow rules must not have trailing slashes when they point at files, and the Sitemap line needs a fully qualified URL.
Compliant crawlers such as Googlebot will skip any URL matching a Disallow rule. Keep in mind that robots.txt controls crawling, not indexing: a disallowed URL can still appear in Google's results (without a snippet) if other pages link to it. If you need a page kept out of the index entirely, leave it crawlable and put a <meta name="robots" content="noindex"> tag on the page instead, since Google has to crawl the page to see that tag.
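If you want to sanity-check the rules before deploying, Python's standard urllib.robotparser module can evaluate a robots.txt against sample URLs. A quick sketch (the domain is a placeholder, and the rules mirror the snippet above):

```python
from urllib import robotparser

# The rules from the robots.txt above (Sitemap lines are ignored by the parser).
rules = """\
User-agent: *
Disallow: /privacy.php
Disallow: /terms.php
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed pages: compliant crawlers must skip these.
print(rp.can_fetch("*", "https://www.example.com/privacy.php"))  # False
print(rp.can_fetch("*", "https://www.example.com/terms.php"))    # False

# Everything else remains crawlable.
print(rp.can_fetch("*", "https://www.example.com/index.php"))    # True
```

This catches mistakes like a stray trailing slash: with "Disallow: /privacy.php/" the parser would report /privacy.php as still crawlable.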


