To allow crawling of all but a specific folder, do I need to include an empty disallow directive in robots.txt?

@Michele947

Posted in: #RobotsTxt

If I want my website to be crawled, do I need an "empty" disallow?

Is there any difference between

User-agent: *
Disallow:
Disallow: /folder/


and

User-agent: *
Disallow: /folder/


I have seen a robots.txt where the first option is used but I don't understand why.


1 Comment


@Si4351233

An empty Disallow matches nothing and is simply ignored, so your two examples are equivalent; the second one is just the cleaner way to write it.


To allow all robots complete access

User-agent: *
Disallow:


(or just create an empty "/robots.txt" file, or don't use one at all)

www.robotstxt.org/robotstxt.html
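
To see why the two files behave identically, here is a minimal Python sketch of the matching logic that modern parsers apply under RFC 9309 (which Google's crawler follows): a rule with an empty path has nothing to match and is skipped, and otherwise the rule with the longest matching prefix wins. The function name and the plain prefix matching (no * or $ wildcards) are my own simplifications, not code from any real parser.

def allowed(rules, path):
    # rules: list of (directive, pattern) pairs from one user-agent group.
    # Start from "no rule matched", which means the path is allowed.
    best_pattern, best_allow = "", True
    for directive, pattern in rules:
        if not pattern:
            # An empty Disallow has no path to match, so it is ignored.
            continue
        if path.startswith(pattern) and len(pattern) > len(best_pattern):
            # The longest matching rule wins (RFC 9309 precedence).
            best_pattern, best_allow = pattern, (directive == "allow")
    return best_allow

variant1 = [("disallow", ""), ("disallow", "/folder/")]  # with the empty Disallow
variant2 = [("disallow", "/folder/")]                    # without it

for p in ("/", "/page.html", "/folder/", "/folder/page.html"):
    assert allowed(variant1, p) == allowed(variant2, p)
    print(p, "allowed" if allowed(variant2, p) else "blocked")

Running this prints the same allow/block decision for every path under both variants: the root and /page.html stay crawlable, everything under /folder/ is blocked, and the extra empty Disallow line changes nothing.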
