Robots.txt and specific file inclusion

Is it possible to use robots.txt to disallow crawling of a folder, but allow crawling of a specific file in that folder?
2 Comments
@Alex's answer is partially correct. I subsequently discovered this, at the very bottom of the page at
en.wikipedia.org/wiki/Robots_exclusion_standard#Allow_directive :
"To exclude all files except one: This is currently a bit awkward, as there is no 'Allow' field. The easy way is to put all files to be disallowed into a separate directory, say 'stuff', and leave the one file in the level above this directory."
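In practice, many major crawlers (Googlebot and Bingbot among them) do honor the non-standard Allow directive, so the question's scenario can be expressed directly. A sketch, using /folder/ and /folder/file.html as placeholder paths:

```
User-agent: *
Disallow: /folder/
Allow: /folder/file.html
```

For crawlers that only implement the original standard (Disallow only), the Wikipedia workaround applies: move everything you want blocked into a separate directory, say /stuff/, and disallow only that directory:

```
User-agent: *
Disallow: /stuff/
```

Note that the Allow-based approach depends on how a given crawler resolves conflicting rules (Google uses the most specific matching path), so test it with the crawler you care about before relying on it.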