Robots.txt and pattern matching
I'm adding this to my robots.txt:
User-agent: *
Disallow: /*action=*$
How do robots that don't recognize wildcards handle this?
1 Comment
Robots that do not recognize wildcards (wildcard support is not part of the original robots.txt specification) will treat * as a literal character. Since a literal * will rarely, if ever, appear in a requested URL, the rule will most likely either never match or be ignored altogether, so in practice it has no effect on those crawlers.
Exactly what happens depends on the crawler's robots.txt implementation, so it cannot be entirely counted on.
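For a concrete illustration, Python's standard-library urllib.robotparser follows the original specification and does not implement wildcards, so it treats the rule literally (the bot name and URL below are just placeholders):

from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /*action=*$
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The parser treats * as a literal character, so no real URL matches the
# Disallow path and the fetch is reported as allowed; the rule has no
# effect on this kind of parser.
print(rp.can_fetch("SomeBot", "https://example.com/page?action=edit"))  # True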
If you want to avoid relying on wildcard support, you could use a separate configuration for Googlebot (and other crawlers that do honor wildcards).
E.g.
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow: /*action=*$
This blocks all robots except Googlebot, which will honor the wildcard rule.
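For comparison, a wildcard-aware crawler such as Googlebot effectively treats * as "any sequence of characters" and a trailing $ as an end-of-URL anchor. Here is a minimal sketch of that matching logic (illustrative only, not Google's actual implementation):

import re

def rule_to_regex(rule):
    # Translate a robots.txt path rule using * and $ into a regular expression.
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape the literal parts, then turn each * into ".*" (match anything).
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    return re.compile(pattern + ("$" if anchored else ""))

rule = rule_to_regex("/*action=*$")
print(bool(rule.match("/page?action=edit")))  # True  -> disallowed
print(bool(rule.match("/about")))             # False -> still allowed

Note that because * already matches any run of characters, the trailing *$ in this pattern adds no further restriction; Disallow: /*action= would behave the same for a wildcard-aware crawler.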