Mobile app version of vmapp.org
Robots.txt: exclude certain URLs and include others

@Ravi8258870

Posted in: #RobotsTxt

I have a robots.txt file that needs to mass-exclude certain URLs, currently set up as follows:

Disallow: /somestring

Disallow: /*/somestring


However, some user-generated URLs are coming back like:

/somestring-something-else


I would like to Allow: this type of URL. In other words, if there are no other characters after a pattern match such as /somestring or /*/somestring, exclude it; if there are other characters, include it.

Is there a way around this, or a robots.txt directive appropriate in this case?




1 Comment


@Ann8826881

if there are no other characters after a pattern match like /somestring or /*/somestring exclude, if there are other characters, include.


You can use $ to designate the end of the URL.

Disallow: /somestring$
Disallow: /*/somestring$


If there are other characters in the URL after somestring, the rule will not match, and the URL will therefore be allowed by default.

As with the "wildcard" *, this is not part of the original robots.txt standard, but is supported by all the major search engines.
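If you want to sanity-check the matching behavior before deploying, here is a minimal sketch (not part of the original answer) that translates these Google-style path patterns into regular expressions, where `*` matches any run of characters and a trailing `$` anchors the pattern at the end of the path. The function names and rule list are illustrative, not from any library:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern ('*' wildcard, optional
    trailing '$' end-anchor) into a compiled regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as 'match anything'.
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

def is_disallowed(path: str, rules: list[str]) -> bool:
    """True if any Disallow pattern matches the URL path."""
    return any(robots_pattern_to_regex(r).match(path) for r in rules)

# The two rules from the answer, with the '$' end-anchor applied.
rules = ["/somestring$", "/*/somestring$"]

print(is_disallowed("/somestring", rules))                 # True  (blocked)
print(is_disallowed("/foo/somestring", rules))             # True  (blocked)
print(is_disallowed("/somestring-something-else", rules))  # False (crawlable)
```

With the `$` anchors in place, the exact paths stay blocked while `/somestring-something-else` falls through and remains crawlable, which is the behavior the question asks for.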


