Hide a Subdomain from Google/Search Engine Results?
I have a subdomain which I do not want to be listed in any search engine results. Let's say I have:
http://www.example.com
http://child.example.com
How can I hide all URLs of the child.example.com domain that are currently showing in the search engine results?
4 Comments
...currently showing in the SEO results?
The other answers focus on proactively preventing a (sub)domain from being indexed (which is primarily what you are asking), rather than on actively removing pages that are already in the search results, which may be what you are really after, judging by your other question.
You still need to block your site in robots.txt and serve a noindex meta tag (or X-Robots-Tag HTTP response header), as stated in the other answers, but you also need to block access to your pages, returning a 404 or 410 status code.
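For example, here is a minimal sketch of serving that header from Apache, assuming mod_headers is enabled (the directive values are illustrative):

# In the .htaccess or vhost config for child.example.com
# Requires mod_headers; tells crawlers not to index any response
Header set X-Robots-Tag "noindex, nofollow"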
You can read more about this on the Google Webmaster Tools help page:
Requirements for removing content
Once you have all these in place then you can use the Remove URLs tool in Google Webmaster Tools. However, this only applies to individual URLs, not an entire site, but it's a start. (Google states that using robots.txt, noindex and serving a 404 are the requirements in order to use the GWT removal tool.)
However, if you still want regular users to be able to access the site by typing the URL, then it's a problem, as your content is no longer available. You could password-protect it, as Zistoloen suggests; however, this returns a 403 (Forbidden) by default, which you would need to override to return a 404/410. You could cloak your content, returning a 410 to Googlebot and allowing everyone else in - but what about people clicking on results in the SERPs?
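As a rough sketch of that cloaking approach, assuming Apache with mod_rewrite (matching on the user agent string only; a real setup might also verify Googlebot's IP):

# Return 410 Gone to Googlebot, serve the site to everyone else
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^ - [G]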
But if you want Google to remove your content in the fastest possible time then you need to remove it from the "Google" internet.
The solutions from Kenzo and Paul are good: you can put noindex meta tags on your web pages and add a robots.txt to disallow robots.
But in my opinion, the best solution is to use password authentication on your subdomain. This is the only way to be sure robots cannot access and index your web site. If you use Apache, you can implement this with htpasswd.
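For instance, a minimal sketch of HTTP Basic authentication on Apache (the password file path, user name, and realm name are illustrative):

# Create the password file once on the server
htpasswd -c /etc/apache2/.htpasswd someuser

# In the .htaccess (or vhost config) for the subdomain
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user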
Using a robots.txt file in your subdomain will help (and Google will obey this), but another step you can take is to request through a Google Webmasters account that this subdomain not be indexed. You can also use a meta tag on all pages in the subdomain:
<meta name="robots" content="noindex">
If this happens to be a site that you are only using for internal testing, limiting visibility to a specified set of IP addresses in your virtual hosts file would further hide the site.
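A minimal sketch of such an IP restriction for Apache 2.4 (the address range and paths are illustrative):

<VirtualHost *:80>
    ServerName child.example.com
    DocumentRoot /var/www/child

    # Only the specified internal range may view the site;
    # everyone else receives 403 Forbidden
    <Directory /var/www/child>
        Require ip 203.0.113.0/24
    </Directory>
</VirtualHost>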
In the root directory of the subdomain website, add a file called robots.txt containing:
User-agent: *
Disallow: /
This will tell web crawlers not to crawl the site at all. They do not have to obey, but the main ones will.