
@Kevin317

There's actually a third way to prevent Google and other search engines from indexing URLs: the X-Robots-Tag HTTP response header. This is better than meta tags because it works for all document types, and you can set more than one tag.


The REP META tags give you useful control over how each webpage on your site is indexed. But they only work for HTML pages. How can you control access to other types of documents, such as Adobe PDF files or video and audio files? Well, now the same flexibility for specifying per-URL tags is available for all other file types.

We've extended our support for META tags so they can now be associated with any file. Simply add any supported META tag to a new X-Robots-Tag directive in the HTTP Header used to serve the file. Here are some illustrative examples:
Don't display a cache link or snippet for this item in the Google search results:
X-Robots-Tag: noarchive, nosnippet
Don't include this document in the Google search results:
X-Robots-Tag: noindex
Tell us that a document will be unavailable after 7th July 2007, 4:30pm GMT:
X-Robots-Tag: unavailable_after: 7 Jul 2007 16:30:00 GMT
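Any web server or framework that lets you set response headers can emit these directives. As a minimal sketch using Python's standard library (the handler class, the `x_robots_value` helper, and the per-extension policy are all illustrative assumptions, not part of the original post):

```python
from http.server import SimpleHTTPRequestHandler

# Hypothetical per-extension policy: which X-Robots-Tag directives to
# attach to which file types. Chosen for illustration only.
RULES = {
    ".pdf": ["noindex"],
    ".mp4": ["noarchive", "nosnippet"],
}

def x_robots_value(path, rules=RULES):
    """Return the X-Robots-Tag value for a request path, or None."""
    for suffix, directives in rules.items():
        if path.lower().endswith(suffix):
            return ", ".join(directives)
    return None

class TaggingHandler(SimpleHTTPRequestHandler):
    """Serves files as usual, adding X-Robots-Tag where RULES match."""
    def end_headers(self):
        value = x_robots_value(self.path)
        if value is not None:
            self.send_header("X-Robots-Tag", value)
        super().end_headers()
```

Serving the current directory with `TaggingHandler` would then attach `X-Robots-Tag: noindex` to every PDF response while leaving HTML responses untouched.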

You can combine multiple directives in the same document. For example:
Do not show a cached link for this document, and remove it from the index after 23rd July 2007, 3pm PST:
X-Robots-Tag: noarchive
X-Robots-Tag: unavailable_after: 23 Jul 2007 15:00:00 PST
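Since each directive can go on its own header line, generating a combination like the one above is just a matter of emitting several (name, value) pairs. A small sketch (the helper name, its signature, and the timestamp formatting are assumptions modeled on the examples above, not an official API):

```python
from datetime import datetime

def robots_headers(directives, unavailable_after=None):
    """Build one (name, value) pair per X-Robots-Tag header line.

    Timestamps are formatted like the examples above, e.g.
    "23 Jul 2007 15:00:00 GMT".
    """
    headers = [("X-Robots-Tag", d) for d in directives]
    if unavailable_after is not None:
        stamp = unavailable_after.strftime("%d %b %Y %H:%M:%S GMT")
        headers.append(("X-Robots-Tag", "unavailable_after: " + stamp))
    return headers

# Reproduces the combined example: a noarchive line plus an
# unavailable_after line as two separate header lines.
pairs = robots_headers(["noarchive"],
                       unavailable_after=datetime(2007, 7, 23, 15, 0, 0))
```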
