Rules for a robots.txt file to exclude a whole subdomain from Google's search results
In my website's root folder I already have a robots.txt file with some rules for the domain name www.example.com.
My website is multilingual, so I have set up de.example.com, fr.example.com, and es.example.com with Drupal.
I want to exclude the subdomain es.example.com from Google's search results.
Because there is only one document root, will the following rules work in that single robots.txt file?
User-agent: *
Disallow: es.example.com/
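(For context: as far as I understand, crawlers fetch robots.txt separately for each hostname, and Disallow is supposed to take a URL path rather than a domain. So a file served at es.example.com/robots.txt that blocks the whole subdomain would presumably just be:

User-agent: *
Disallow: /

Since all the subdomains share one document root in my Drupal setup, one workaround I have seen suggested is serving a different file per host with an Apache rewrite in .htaccess — the file name robots_es.txt below is just a placeholder for a second robots file kept next to the main one:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^es\.example\.com$ [NC]
RewriteRule ^robots\.txt$ robots_es.txt [L]

With that in place, requests for es.example.com/robots.txt would get the blocking file while the other subdomains keep the regular one.)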
EDIT:
I have updated the robots.txt file and submitted it to Google Webmaster Tools (robots.txt Tester).
It didn't give me any errors or warnings, so I will wait for the results!