Is this a correct robots.txt file?
I would like to allow Googlebot and Mediapartners-Google (the AdSense user agent) to crawl my website, so I have written the code below in my robots.txt file.
User-agent: Googlebot
Disallow:
User-agent: Mediapartners-Google
Disallow:
Sitemap: website.com/sitemap.xml
Is the above robots.txt file correctly written? Yes or no?
No, you don't actually need those rules: the default behaviour is to allow everything, so an empty Disallow: line just restates the default. A Disallow rule with a path is what blocks a crawler. For example, this blocks Googlebot from the whole site:
User-agent: Googlebot
Disallow: /
If you want to allow everything to all user agents, leave the Disallow value empty:
User-agent: *
Disallow:
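If you want to verify this behaviour yourself, here is a minimal sketch using Python's built-in urllib.robotparser module (the Googlebot user agent and the URL are just illustrative placeholders); it shows that an empty Disallow allows crawling while Disallow: / blocks it:

import urllib.robotparser

def can_crawl(robots_txt, agent, url):
    # Parse the robots.txt text and ask whether this agent may fetch the URL.
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

allow_all = "User-agent: Googlebot\nDisallow:"
block_all = "User-agent: Googlebot\nDisallow: /"

# Empty Disallow: the page stays crawlable.
print(can_crawl(allow_all, "Googlebot", "https://website.com/page"))  # True
# "Disallow: /": the whole site is blocked for Googlebot.
print(can_crawl(block_all, "Googlebot", "https://website.com/page"))  # False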
robots.txt lets you block web crawlers rather than allow them; if you want to allow a specific crawler such as Googlebot, simply leave it out of your robots.txt (the same goes for Mediapartners-Google).
Just take a look at the robots.txt of Pro Webmasters, for example.
If you don't want to block these two crawlers and you don't need special rules for any other crawler, your robots.txt can simply be:
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
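As a quick check, here is a minimal sketch (assuming Python 3.8+ so that site_maps() is available, and treating example.com as a placeholder) confirming that both crawlers are allowed by this file:

import urllib.robotparser

robots_txt = """\
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

for agent in ("Googlebot", "Mediapartners-Google"):
    # Both crawlers fall under "User-agent: *", so every page is allowed.
    print(agent, rp.can_fetch(agent, "https://example.com/any-page"))  # prints "<agent> True"

print(rp.site_maps())  # ['https://example.com/sitemap.xml']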