
Securing images from browsers and humans

@Cooney921

Posted in: #Images #RobotsTxt #Security #WebCrawlers

Particular parts of a website I'm working on will have pictures not intended for the general public. The area is already behind password protection and the pages have noindex, nofollow rules. I'm concerned about the images, though.

One site said to add this to robots.txt:


User-Agent: *
Disallow: /private-images


Then if someone goes to www.companywebsite.com/robots.txt, they can very easily see where we're storing the private images. They should, however, get a 403 Forbidden error if they try to access that directory directly. Still, I want to be as sure as possible that this is the best approach.
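For illustration, this is roughly the kind of web-server rule that would produce that 403 (Apache 2.4 .htaccess syntax; the exact mechanism here is just an assumed sketch, not necessarily our actual setup):

# Hypothetical .htaccess placed inside /private-images.
# Any direct request to a file in this directory is refused with 403 Forbidden;
# the application itself would then have to read the files from disk to serve them.
Require all denied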

Are there any other things to consider or implement to keep unwanted eyes from accessing images on a website? I'm trying to protect trade secrets, so only registered dealers should be able to see the information.




3 Comments


 

@Murray432

As you already know, authentication is the way to solve this problem. It is not clear to me whether you currently protect only the pages showing the images, or the images themselves. Protecting the images can be done in several ways:


It is possible to put the images into a separate directory and use basic access authentication to protect the whole directory (via .htaccess). Your application could send the necessary Authorization header when it fetches the images, while requests that only know the URL won't get them (see the first sketch after this list). Note that basic authentication is only secure together with SSL/TLS connections.
Instead of using URLs that point to the real location of the images, you could work with URLs interpreted by your application. When your application gets a request like www.example.com/img/12abdi341ldkfjoi, it could fetch an image from a directory that is not visible to the web, e.g. /root/hidden/image1.jpg, and return that file as the response (see the second sketch after this list). Of course, you can handle the authentication the same way as you do for your HTML pages.
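
A minimal sketch of the first option, assuming Apache and an already existing password file (the file path and realm name are placeholders):

# Hypothetical .htaccess for the protected image directory.
# Requests without valid credentials are rejected; the application can send
# the matching Authorization header when it fetches or embeds the images.
AuthType Basic
AuthName "Registered dealers only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user

And a minimal sketch of the second option, serving the images through the application; Flask, the token-to-file mapping, and the session-based login check are assumptions chosen for illustration:

# image_proxy.py - hypothetical sketch of serving hidden images through the app
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # placeholder

# Hypothetical mapping from opaque URL tokens to files stored outside the web root.
IMAGE_MAP = {
    "12abdi341ldkfjoi": "/root/hidden/image1.jpg",
}

@app.route("/img/<token>")
def serve_image(token):
    # Reuse whatever login check the rest of the application already uses.
    if not session.get("logged_in"):
        abort(403)
    path = IMAGE_MAP.get(token)
    if path is None:
        abort(404)
    # The file lives outside the document root, so it can never be requested
    # directly; only this authenticated route can return it.
    return send_file(path, mimetype="image/jpeg")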


What I would not rely on is hoping that the image URLs never leak just because only protected pages contain them. The loaded pages remain in every user's browser cache, and users may copy and paste the URLs to your images.



 

@Ann8826881

If your password protection system works correctly (i.e., if it is secure), so that users can’t access the images even when they enter the images’ URLs directly, then search engine bots won’t be able to access the images either. Bots don’t have special powers; they are regular users in that regard.

If you want to ask bots not to try to crawl URLs of the restricted areas, you can use robots.txt. If you don’t want to disclose the actual URL, you can specify only its beginning, e.g.:

User-Agent: *
Disallow: /priv


(This would disallow crawling of all URLs whose path starts with /priv, e.g., /private, /private/, /priv.html, /private-images, etc.)



 

@Sent6035632

Trying to protect trade secrets...


Displaying those "trade secrets" on a website at all is asking for trouble. Advertising the paths you don't want people to access in robots.txt just gives attackers an extra opportunity to try to gain access to your system.

As Simon noted, you need to use authentication. This normally means requiring a valid username and password from authorized users before they can access the secret material.

If that's too much for your guests, then a captcha may work, but I don't recommend it because hackers could catch on to it.

If the items are true trade secrets, you could step the security up and require users to check their email for a one-time code when they log in successfully, as an extra step to prove they are real.
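
A minimal sketch of that extra step; the code length, the ten-minute expiry, and the in-memory store are illustrative assumptions (a real system would persist the codes and send them through a mailer):

# email_code.py - hypothetical one-time email code issued after password login
import hmac
import secrets
import time

CODE_TTL_SECONDS = 10 * 60   # assumed expiry window of 10 minutes
pending_codes = {}           # username -> (code, issued_at); use a real store in production

def issue_login_code(username):
    # Generate a random 6-digit code to be emailed to the user.
    code = f"{secrets.randbelow(10**6):06d}"
    pending_codes[username] = (code, time.time())
    return code  # hand this off to your mailer

def verify_login_code(username, submitted):
    # Accept the code only once and only within the expiry window.
    entry = pending_codes.pop(username, None)
    if entry is None:
        return False
    code, issued_at = entry
    fresh = (time.time() - issued_at) <= CODE_TTL_SECONDS
    return fresh and hmac.compare_digest(submitted.encode(), code.encode())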

Returning a 403 error to unauthorized users is fine for the moment. For authorized users, and for maximum security, you could set up a special URL that takes them to a login page where they can authenticate themselves.

What most sites do, which is somewhat less secure, is redirect users to the login page instead of returning a 403 when they try to access restricted content. I say this is somewhat less secure because robots are also redirected to the login page, which gives them an opportunity to attack it. By giving the special login URL to only a handful of people, you lower the chances of a robot attacking the server.


