Cascading matches in robots.txt
I was wondering whether it's possible for a bot to match two User-agent: groups in robots.txt.
I have this in my robots.txt:
User-Agent: AhrefsBot # https://ahrefs.com/robot/
Crawl-Delay: 5

User-agent: *
Disallow: /error.php
Disallow: /cron.php
... and many more rules ...
Now I wonder: will AhrefsBot respect the Crawl-Delay and also respect the rules under User-agent: *?
If not, I'd have to duplicate everything from User-agent: * into the AhrefsBot group, as sketched below, which would be quite inconvenient.
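For illustration, duplicating the rules into the AhrefsBot group would look something like this (using just the two paths shown above; the real list would be much longer):

User-Agent: AhrefsBot # https://ahrefs.com/robot/
Crawl-Delay: 5
Disallow: /error.php
Disallow: /cron.php

User-agent: *
Disallow: /error.php
Disallow: /cron.php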
Once a bot matches a specific User-agent group, it reads only the rules in that group. Under the robots.txt standard, a crawler picks the single most specific group that matches its user agent and ignores all others, so AhrefsBot will most likely not respect the rules under User-agent: *, and you will need to duplicate them. The robots.txt files of Twitter and Facebook support this: they duplicate the rules for every bot they mention by name. https://twitter.com/robots.txt https://www.facebook.com/robots.txt
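You can observe this behavior with Python's standard-library urllib.robotparser, which likewise matches a bot against a single group. A minimal sketch, using example.com as a placeholder site:

from urllib.robotparser import RobotFileParser

# The robots.txt from the question, minus the inline comment.
robots_txt = """\
User-Agent: AhrefsBot
Crawl-Delay: 5

User-agent: *
Disallow: /error.php
Disallow: /cron.php
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# AhrefsBot matches its own group, which contains no Disallow rules,
# so paths blocked for everyone else are allowed for it:
print(parser.can_fetch("AhrefsBot", "https://example.com/error.php"))     # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/error.php"))  # False

# The Crawl-Delay applies only to the group that declares it:
print(parser.crawl_delay("AhrefsBot"))     # 5
print(parser.crawl_delay("SomeOtherBot"))  # None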