Does repeating the same rules for each major crawler bot make sense in robots.txt?

SEO is a bit of an arcane topic, and I sometimes encounter odd practices that don't seem to be backed by any published advice but still persist.
Now I mean something like this:
User-agent: Googlebot
Disallow: /page/
Disallow: /ajax
Disallow: *?back*
User-agent: Yandex
Disallow: /page/
Disallow: /ajax
Disallow: *?back*
...
User-agent: *
Disallow: /page/
Disallow: /ajax
Disallow: *?back*
Does this make any sense? The rules are exactly the same for every bot.
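For comparison, my understanding of the robots.txt format (which I haven't verified against every crawler) is that several User-agent lines can be stacked above a single rule set, so the identical groups above could presumably be collapsed to something like:

User-agent: Googlebot
User-agent: Yandex
User-agent: *
Disallow: /page/
Disallow: /ajax
Disallow: *?back*

Or even just the single User-agent: * group on its own, since a crawler that finds no group naming it specifically is supposed to fall back to the * group. The only reason I can see for repeating rules per bot is if you expect the rules to diverge later, because a crawler that does match a named group ignores the * group entirely.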