
@Kevin317

There are multiple ways to do this (and combining them is the most reliable approach):

1) Use robots.txt to block the files from search engine crawlers:

User-agent: *
Disallow: /pdfs/ # Block the /pdfs/ directory.
Disallow: *.pdf # Block PDF files. Non-standard, but honored by the major search engines.

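If you want to sanity-check those rules, Python's standard urllib.robotparser can evaluate your live robots.txt (a minimal sketch; example.com stands in for your own domain, and note the stdlib parser implements the original robots.txt spec, so it honors the /pdfs/ directory rule but not the wildcard line):

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# The /pdfs/ directory rule blocks this URL for any user agent.
print(rp.can_fetch("*", "https://example.com/pdfs/report.pdf"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))       # True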

2) Use rel="nofollow" on links to those PDFs. Note that this only keeps crawlers from following your own links; the files can still be discovered through links on other sites:

<a href="something.pdf" rel="nofollow">Download PDF</a>


3) Use the X-Robots-Tag: noindex HTTP header to prevent crawlers from indexing them. On Apache, place this code in your .htaccess file:

<FilesMatch "\.pdf$">
# Requires mod_headers to be enabled.
Header set X-Robots-Tag "noindex"
</FilesMatch>
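
Once that's in place, you can confirm the header is actually being served. A minimal check with Python's standard library (the URL is a placeholder for one of your own PDFs):

import urllib.request

# Send a HEAD request so the PDF body isn't downloaded.
req = urllib.request.Request("https://example.com/pdfs/report.pdf", method="HEAD")
resp = urllib.request.urlopen(req)
print(resp.headers.get("X-Robots-Tag"))  # expect: noindex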
