Restrict all robots except Googlebot and Yandexbot

@Ravi8258870

Posted in: #RobotsTxt #WebCrawlers

I want to allow Googlebot to access all my pages with a crawl delay, Yandexbot to access only index.html, and Bingbot to access only the /tools pages.
All other bots should not be able to access my pages at all.

Is this robots.txt suitable for this?

User-agent: Googlebot
Crawl-delay: 300

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /


2 Comments

@Radia820

I would recommend adding a Sitemap directive and, for Yandex, a Host directive to your robots.txt file (Host is a Yandex-specific directive; Google does not support it).
Host allows websites with multiple mirrors to specify their preferred domain.

Sitemap: https://www.example.com/sitemap.xml
Host: example.com
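
Since Sitemap is a file-wide directive and Host is read only by Yandex, both can simply be appended after the last User-agent group. A minimal sketch of how the end of such a file could look, with example.com standing in for your own domain:

User-agent: *
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
Host: example.com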


@BetL925

I would use this code in your case:

User-agent: Googlebot
Crawl-delay: 300
Disallow:

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /


Even though the Crawl-delay: 300 directive is not part of the original robots.txt standard (and Googlebot in particular ignores it), just add an empty Disallow: directive for Googlebot and your code is fine.

Just for your information: to give access to only a single page or directory, you need to place the Allow directive before Disallow: / (the Python sketch below demonstrates this ordering).

More information on Wikipedia.
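
If you want to sanity-check these rules before deploying them, here is a minimal sketch using Python's standard urllib.robotparser module (Python 3.6+ for crawl_delay). The paths below are hypothetical placeholders, and this parser applies rules first-match-in-order, which approximates, but does not exactly replicate, each crawler's own matching:

from urllib.robotparser import RobotFileParser

# The proposed robots.txt, fed to the parser as a list of lines.
rules = """\
User-agent: Googlebot
Crawl-delay: 300
Disallow:

User-agent: Yandex
Allow: /index.html
Disallow: /

User-agent: bingbot
Allow: /tools
Disallow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot: everything allowed, with the (non-standard) crawl delay.
print(parser.can_fetch("Googlebot", "/any/page.html"))  # True
print(parser.crawl_delay("Googlebot"))                  # 300

# Yandex: only index.html.
print(parser.can_fetch("Yandex", "/index.html"))        # True
print(parser.can_fetch("Yandex", "/other.html"))        # False

# bingbot: only the /tools section.
print(parser.can_fetch("bingbot", "/tools/page.html"))  # True
print(parser.can_fetch("bingbot", "/private.html"))     # False

# Any other bot: blocked everywhere.
print(parser.can_fetch("SomeOtherBot", "/index.html"))  # False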


