Is it possible to block search engine indexing using DNS alone?
As per the title, is it possible to block all search engine indexing using DNS? Most guides point towards robots.txt or meta headers but I'd like to investigate the possibility of blocking all indexing for a domain that will host all of our QA sites (which we don't want appearing on search engines).
No, you can't block indexing using DNS alone - short of making your site unreachable. Even if there were a DNS-based method (some kind of TXT record, say), the search engine would need to make a secondary lookup to find it, and no such convention exists.
If you need to block an entire domain, look at the X-Robots-Tag HTTP response header (the HTTP equivalent of the robots meta tag). Most people simply block crawling with robots.txt, but note that this doesn't necessarily prevent indexing if the blocked resource is linked to from elsewhere.
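A minimal sketch of serving that header site-wide, using only the Python 3 standard library (the host, port, and body text here are illustrative, not from the original question):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class NoIndexHandler(BaseHTTPRequestHandler):
    """Attach X-Robots-Tag to every response so compliant crawlers skip indexing."""

    def do_GET(self):
        body = b"QA environment - not for public indexing\n"
        self.send_response(200)
        # HTTP equivalent of <meta name="robots" content="noindex, nofollow">
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Start on an ephemeral port and fetch our own headers to show the effect.
server = HTTPServer(("127.0.0.1", 0), NoIndexHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"
with urllib.request.urlopen(url) as resp:
    print(resp.headers["X-Robots-Tag"])  # noindex, nofollow
server.shutdown()
```

In practice you would set the same header in your web server or reverse proxy configuration for the whole QA domain, rather than per application.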
No. DNS is a simple mechanism: you give it a domain name and it returns a record such as an A record with the target server's IP address. There is nothing in that exchange you can use to signal "do not index".
Even if you managed to put some flag in a DNS record, or added a robots meta tag, or created a robots.txt, crawlers don't have to honour it. Well-behaved crawlers respect robots.txt out of politeness, but there is no technical mechanism forcing them to.
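For completeness, the blanket robots.txt most guides suggest looks like the fragment below; as noted above, it is only a polite request that compliant crawlers choose to honour:

```
# robots.txt at the domain root: ask all crawlers to stay out entirely
User-agent: *
Disallow: /
```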
If a crawler changes its User-Agent request header, a check based on HTTP_USER_AGENT no longer works. Likewise, if you filter by IP address and the crawler switches to a new or unknown IP address, that check fails too.
This is why I prefer to see these as guides for crawlers, not as rules.
Always assume that everything a visitor can see can also be seen by a crawler, one way or another. From a security perspective, that is the safer assumption.