To allow crawling of all but a specific folder, do I need to include an empty disallow directive in robots.txt?
If I want my website to be crawled, do I need an "empty" disallow? Is there any difference between
User-agent: *
Disallow:
Disallow: /folder/
and
User-agent: *
Disallow: /folder/
I have seen a robots.txt where the first option is used, but I don't understand why.
An empty Disallow directive matches nothing, so crawlers simply ignore it; the two files behave identically. The second example is all you need.
To allow all robots complete access
User-agent: *
Disallow:
(or just create an empty "/robots.txt" file, or don't use one at all)
www.robotstxt.org/robotstxt.html
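If you want to verify this yourself, Python's standard urllib.robotparser module can check URLs against a set of rules. A quick sketch (the example.com URLs are placeholders):

from urllib.robotparser import RobotFileParser

# The recommended file: block only /folder/, allow everything else.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /folder/"])
print(rp.can_fetch("*", "https://example.com/folder/page.html"))  # False: blocked
print(rp.can_fetch("*", "https://example.com/page.html"))         # True: allowed

# A lone empty Disallow blocks nothing, i.e. grants full access.
rp_allow = RobotFileParser()
rp_allow.parse(["User-agent: *", "Disallow:"])
print(rp_allow.can_fetch("*", "https://example.com/folder/page.html"))  # True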