Can we use regex in robots.txt file to block URLs?
I have a few dynamically generated URLs. Can I use regex to block these URLs in a robots.txt file?
Regular expressions are not valid in robots.txt, but Google, Bing, and some other bots do recognise some pattern matching.
Say you wanted to block all URLs that have example anywhere in them; you can use a wildcard entry (*):
User-agent: *
Disallow: /*example
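Since dynamically generated URLs usually carry a query string, the same wildcard can block anything containing a ?. As a minimal sketch, assuming your dynamic URLs look something like /page?id=123 (the path and parameter name here are only illustrations):
User-agent: *
Disallow: /*?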
You can also use the dollar sign ($) to specify that the URL must end that way. So if you wanted to block all URLs that end with example, but not URLs that merely had example elsewhere in them, you could use:
User-agent: *
Disallow: /*example$
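The end anchor is also handy for blocking crawling by file type. For instance, assuming you wanted to keep crawlers out of every PDF on the site, you could combine the wildcard and the anchor:
User-agent: *
Disallow: /*.pdf$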
More in-depth info can be found in Google's Robots.txt Specifications and Bing's How to Create a Robots.txt file, and there is an interactive guide on Moz.