What is the correct way to write my "robots.txt" file?
I have written the following code inside my robots.txt file:
User-Agent: Googlebot
Disallow:
User-agent: Mediapartners-Google
Disallow:
Sitemap: example.com/sitemap.xml
Is my robots.txt correct? I only want two user agents to crawl my site (i.e., Googlebot & Mediapartners-Google).
Almost. See the bottom of the section named "Blocking user-agents" in the following: Google Webmaster Tools: Block or remove pages using a robots.txt file
According to that, you should have:
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /
User-agent: Mediapartners-Google
Allow: /
Sitemap: https://example.com/sitemap.xml
See the bottom of the page in the above link for how to test your robots.txt file.
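If you also want a quick local sanity check, below is a minimal sketch using Python's standard-library urllib.robotparser; the domain example.com and the page URL are placeholders. Note that robotparser's matching is close to, but not identical to, Google's own parser, so the tester linked above remains the authoritative check.

from urllib.robotparser import RobotFileParser

# The rules suggested above, with a placeholder sitemap URL.
ROBOTS_TXT = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Mediapartners-Google
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot and Mediapartners-Google should be allowed; everything else blocked.
for agent in ("Googlebot", "Mediapartners-Google", "Bingbot"):
    allowed = parser.can_fetch(agent, "https://example.com/some-page.html")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")

Running this should print "allowed" for Googlebot and Mediapartners-Google and "blocked" for the third agent.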
Nearly, though you need to Disallow all the other bots first. The wildcard (*) below means all bots.
User-agent: *
Disallow: /
User-Agent: Googlebot
Disallow:
User-agent: Mediapartners-Google
Disallow:
Sitemap: https://example.com/sitemap.xml
Although please note, not all web crawlers will obey robots.txt.
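For what it's worth, the empty "Disallow:" used here and the "Allow: /" used in the other answer are equivalent for this purpose: an empty Disallow value means nothing is disallowed for that user agent. Here is the same kind of quick check for this variant, again with placeholder URLs:

from urllib.robotparser import RobotFileParser

# The variant above: an empty "Disallow:" means nothing is disallowed
# for that user agent, so Googlebot and Mediapartners-Google stay allowed.
VARIANT = """\
User-agent: *
Disallow: /

User-Agent: Googlebot
Disallow:

User-agent: Mediapartners-Google
Disallow:

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(VARIANT.splitlines())

for agent in ("Googlebot", "Mediapartners-Google", "Bingbot"):
    print(agent, parser.can_fetch(agent, "https://example.com/"))
# Expected: Googlebot True, Mediapartners-Google True, Bingbot False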