
Robots.txt and specific file inclusion

@Murray432

Posted in: #RobotsTxt #Seo

Is it possible to use robots.txt to disallow crawling of a folder, but allow crawling a specific file in that folder?





2 Answers


 

@Murray432

@Alex's answer is only partially correct. I subsequently discovered that an Allow directive exists:
en.wikipedia.org/wiki/Robots_exclusion_standard#Allow_directive
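For crawlers that honor the Allow directive (Googlebot and most other major crawlers), the rules can be combined like this; the folder and file names below are placeholders, not from the original question:

```
User-agent: *
Allow: /folder/allowed-page.html
Disallow: /folder/
```

Note on ordering: the original 1994 convention used first-match semantics, so listing the Allow line before the Disallow line is the safest arrangement. Google instead applies the most specific (longest) matching rule regardless of order, so for Googlebot either ordering works.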



 

@Speyer207

From the very bottom of the robotstxt.org specification page:


To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field.
The easy way is to put all files to be disallowed into a separate directory,
say "stuff", and leave the one file in the level above this directory.
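The workaround described above amounts to a directory layout like the following (all names here are illustrative):

```
/keep-me.html          <- the one crawlable file, one level up
/stuff/private-1.html  <- everything to hide lives under /stuff/
/stuff/private-2.html
```

robots.txt then needs only a single Disallow rule, and no Allow support is required from the crawler:

```
User-agent: *
Disallow: /stuff/
```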


