
Keeping files private on the internet (.htaccess password or software/php/wordpress password)

@Si4351233

Posted in: #Privacy #SearchEngines #Security

I was asked a while ago to set up a server so that only authenticated users could access its files. It was essentially a test server where clients could view work-in-progress sites. More recently, I want to do something similar for some of my own files. Though they are not particularly confidential, I would like to be the only one able to view them.

I thought of doing the same thing:


1. Create a robots.txt:

User-agent: *
Disallow: /

2. Set up some password protection. .htpasswd seems like a very ugly way to do it; it will prompt me even when I log in over FTP. I wonder whether a software method, like password-protected posts in WordPress, would do the trick of locking out the public and hiding the content from search engines. Or would some self-made PHP script do it?
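For clarity, this is roughly the kind of .htaccess setup I mean; the path to the password file and the realm name are just examples:

AuthType Basic
AuthName "Private files"
AuthUserFile /home/example/.htpasswd
Require valid-user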





2 Comments


 

@Angie530

Take a look at mod_access - from your description of what you need, an IP whitelist should be sufficient.
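A minimal sketch of such a whitelist in an .htaccess file. The directives below assume the older Apache 2.2 mod_access style, and the IP address is a placeholder; on Apache 2.4 the equivalent is a Require ip rule from mod_authz_host:

# Apache 2.2 style: block everyone except one address
Order deny,allow
Deny from all
Allow from 203.0.113.10

# Apache 2.4 equivalent
# Require ip 203.0.113.10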



 

@Kristi941

The robots.txt method won't offer you any real security: bots don't have to honor it, and content disallowed there has been shown to get indexed anyway. Moreover, if someone knows where the content is, robots.txt does nothing to stop them from accessing it.

Basic Authentication, which is what .htpasswd provides, will keep out bots and anyone else you don't want in. But it is clunky, and if you access the content from a public computer you had better make sure you close the browser, or anyone can get to it without logging back in.
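If you do go with Basic Authentication, the credentials file is normally created with Apache's htpasswd utility, something like the following, where the path and username are placeholders (-c creates the file, so omit it when adding further users):

htpasswd -c /home/example/.htpasswd myuser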

Password-protected posts like those in WordPress mean creating a login and authentication system of your own. That is very flexible, but it obviously requires more work on your end to set up.

Basic Authentication will do exactly what you need, and it is quick to set up. If you're the only person trying to access the files, it is probably the best way to go. But if you ever plan on expanding who can access that content, you will want to build a basic login system for yourself, as sketched further below; there are lots of tutorials available for doing that (at least in PHP). Definitely use the robots.txt file, and I would also place an .htaccess file that uses the X-Robots-Tag to keep the content out of search engines, as it is more effective:

Header set X-Robots-Tag "noindex"
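For the self-made option mentioned above, a very small sketch of a session-based PHP gate could look like this. The password, session key, and protected file path are all made up for illustration; a real version should store a hashed password and run over HTTPS:

<?php
// gate.php - minimal illustrative password gate (placeholder names throughout)
session_start();

// mark the session as authenticated when the right password is posted
if (isset($_POST['password']) && $_POST['password'] === 'change-me') {
    $_SESSION['authed'] = true;
}

// anyone without the session flag gets a bare login form and nothing else
if (empty($_SESSION['authed'])) {
    echo '<form method="post"><input type="password" name="password"><button>Log in</button></form>';
    exit;
}

// authenticated: stream a file kept outside the web root
header('Content-Type: application/pdf');
readfile('/home/example/private/report.pdf');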


