
@Heady270

To allow all crawling you have a few options. The clearest and most widely supported is:

User-agent: *
Disallow:


To paraphrase, it means: "All user agents have nothing disallowed; they can crawl everything." This is the version of "allow all crawling" that is listed on robotstxt.org.



Another option is to have no robots.txt file. When robots encounter a 404 error at /robots.txt, they assume that crawling is not restricted.
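For example, I believe Python's standard urllib.robotparser behaves this way: if fetching robots.txt returns a 404, the parser treats every URL as allowed. A minimal sketch (the URL is just a placeholder, and rp.read() does a real network fetch):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder site
rp.read()  # a 404 response here makes the parser allow everything

# Prints True when the site has no robots.txt (or one that disallows nothing)
print(rp.can_fetch("*", "https://www.example.com/any/page.html"))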



I would not recommend using Allow: directives in robots.txt. Not all crawlers support them, and when you have both Allow: and Disallow: directives, the longest matching rule takes precedence rather than the first or last matching rule, which makes the file much harder to reason about. If you do use Allow:, be sure to test your robots.txt file with a testing tool such as the one from Google.
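To illustrate the longest-match behavior, here is a rough sketch (not any crawler's actual code) of how a Google-style parser picks between Allow: and Disallow: rules; the rules and paths are made up for the example:

def most_specific_rule(path, rules):
    # rules is a list of (directive, pattern) pairs, e.g. ("allow", "/blog/").
    # Only plain prefix patterns are handled here; real crawlers also support
    # the * wildcard and the $ end anchor.
    best = ("allow", "")  # no matching rule means the URL may be crawled
    for directive, pattern in rules:
        if path.startswith(pattern) and len(pattern) > len(best[1]):
            best = (directive, pattern)
    return best[0]

rules = [
    ("disallow", "/private/"),
    ("allow", "/private/annual-report.html"),
]

print(most_specific_rule("/private/secret.html", rules))         # disallow
print(most_specific_rule("/private/annual-report.html", rules))  # allow

The second URL is crawlable because the Allow: pattern that matches it is longer (more specific) than the Disallow: /private/ pattern, even though the Disallow: rule also matches.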
