Restrict all robots except Googlebot and Yandexbot
I want to allow Googlebot to access all my pages with a crawl delay, Yandexbot to access only index.html, and Bingbot to access only the /tools pages.
All other bots should not be able to access my pages.
Is this robots.txt suitable for this?
User-agent: Googlebot
Crawl-delay: 300

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /
I would recommend adding a Sitemap directive, and a Host directive (used by Yandex), to your robots.txt file.
Host lets a website with multiple mirrors specify its preferred domain.
Sitemap: https://www.example.com/sitemap.xml
Host: example.com
I would use this code in your case:
User-agent: Googlebot
Crawl-delay: 300
Disallow:

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /
Even though Crawl-delay: 300 is not a standard directive, just add an empty Disallow: line to the Googlebot group and your code is fine.
Just for your information: to give access only to a single page or directory, you need to place the Allow directive before Disallow: /.
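If you want to double-check the behavior before deploying, here is a minimal sketch using Python's urllib.robotparser. The https://example.com URLs and the "SomeOtherBot" name are placeholders, and this parser is only a rough model; real crawlers may resolve Allow/Disallow precedence slightly differently.

from urllib import robotparser

# Sketch: parse the proposed robots.txt locally and check which URLs
# each crawler would be allowed to fetch. example.com and SomeOtherBot
# are placeholders, not values from the question.
rules = """\
User-agent: Googlebot
Crawl-delay: 300
Disallow:

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/any/page.html"))  # True
print(rp.crawl_delay("Googlebot"))                                     # 300
print(rp.can_fetch("Yandex", "https://example.com/index.html"))        # True
print(rp.can_fetch("Yandex", "https://example.com/other.html"))        # False
print(rp.can_fetch("bingbot", "https://example.com/tools/page.html"))  # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/index.html"))  # False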
More information is available on Wikipedia.