Cascading matches in robots.txt

@Cugini213

Posted in: #RobotsTxt #SearchEngines

I was wondering if it's possible to let bots match two groups of User-Agent: blocks.

I have this block of robots.txt

User-Agent: AhrefsBot # https://ahrefs.com/robot/
Crawl-Delay: 5

User-agent: *
Disallow: /error.php
Disallow: /cron.php
... and many more rules ...


Now, I wonder: will AhrefsBot respect the Crawl-Delay and also follow the rules under User-agent: *?
If not, I'd have to duplicate all the User-agent: * rules into the AhrefsBot group, which would be quite unwieldy.
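A quick way to test this locally is Python's standard-library urllib.robotparser, which follows the classic robots.txt convention that a bot obeys only the single most specific matching group. This is only a sketch of that convention; individual crawlers may interpret robots.txt differently:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the question (Crawl-Delay on its own line).
robots_txt = """\
User-agent: AhrefsBot
Crawl-delay: 5

User-agent: *
Disallow: /error.php
Disallow: /cron.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# AhrefsBot matches only its own group, so the Disallow rules
# under User-agent: * do NOT apply to it:
print(rp.can_fetch("AhrefsBot", "/error.php"))   # True (allowed)
print(rp.crawl_delay("AhrefsBot"))               # 5

# A bot without its own group falls through to User-agent: *:
print(rp.can_fetch("SomeOtherBot", "/error.php"))  # False (disallowed)
```

So under this parser, giving AhrefsBot its own group silently exempts it from the User-agent: * rules.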




1 Comment


 

@Martha676

Once a User-Agent is mentioned specifically, a bot generally reads only the rules in that group. Most likely AhrefsBot will not respect the rules under User-agent: *, so you need to duplicate them in its group. As supporting examples, Twitter's and Facebook's robots.txt files duplicate the rules for every bot they mention specifically: twitter.com/robots.txt and www.facebook.com/robots.txt
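In other words, to keep the Disallow rules while adding the Crawl-Delay, the usual fix is to repeat the shared rules inside the bot-specific group, along the lines of:

```text
User-Agent: AhrefsBot # https://ahrefs.com/robot/
Crawl-Delay: 5
Disallow: /error.php
Disallow: /cron.php
# ... repeat the remaining User-agent: * rules here ...

User-agent: *
Disallow: /error.php
Disallow: /cron.php
```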


