Robots.txt exclude certain urls and include others
I have a robots.txt file which needs to mass-exclude certain URLs, currently set up like so:
Disallow: /somestring
Disallow: /*/somestring
However, some of my user-generated URLs come back like:
/somestring-something-else
I would like to Allow: this type of URL. In other words, if there are no other characters after a pattern match like /somestring or /*/somestring, exclude the URL; if there are other characters, include it.
Is there a way around this, or a robots.txt directive appropriate for this case?
if there are no other characters after a pattern match like /somestring or /*/somestring, exclude the URL; if there are other characters, include it.
You can use $ to designate the end of the URL.
Disallow: /somestring$
Disallow: /*/somestring$
If there are other characters in the URL after somestring, then they will not match and will therefore be allowed by default.
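Putting it together, a minimal robots.txt sketch for your case might look like this (the User-agent line is assumed; # comments are valid in robots.txt):

User-agent: *
# Block exact matches only; /somestring-something-else stays crawlable
Disallow: /somestring$
Disallow: /*/somestring$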
Like the * wildcard, the $ anchor is not part of the original robots.txt standard, but it is supported by all the major search engines.
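If you want to sanity-check the matching offline before deploying, here is a rough Python sketch. Since wildcard support varies across robots.txt parsers, it does the check by hand: pattern_to_regex is a hypothetical helper that approximates the * and $ semantics described above.

import re

# Hypothetical helper: translate a robots.txt path pattern that may
# contain * (any characters) and a trailing $ (end of URL) into a regex.
def pattern_to_regex(pattern):
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything, then restore * as "match any sequence"
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

rules = ["/somestring$", "/*/somestring$"]
for url in ["/somestring", "/foo/somestring", "/somestring-something-else"]:
    blocked = any(pattern_to_regex(rule).match(url) for rule in rules)
    print(url, "->", "Disallow" if blocked else "Allow")

# Output:
# /somestring -> Disallow
# /foo/somestring -> Disallow
# /somestring-something-else -> Allow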