Confused about writing a robots.txt file

I have been getting a DNS error in Google Webmaster Tools for the last five days, so I would like to verify my robots.txt file. Is the robots.txt file below correct?
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
Please help me clear up this doubt.
If you're getting a DNS error, it's not related to your robots.txt file; it's related to your DNS settings. You can check for DNS errors with online tools such as DNS Health.
If you need additional help with that, I'd suggest making a screenshot of your DNS table settings from your DNS service provider and asking another question with that screenshot pasted in it (i.e., example).
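As a quick first check before digging into your provider's DNS table, you can confirm whether your hostname resolves at all from your own machine. A minimal sketch using Python's standard-library socket module (the hostname shown is a placeholder, not your actual domain):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to an IP address, False otherwise."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # gaierror is raised when DNS resolution fails
        return False

# Example usage with a placeholder hostname:
print(resolves("example.com"))
```

If this returns False for your domain while other sites resolve fine, the problem is on the DNS side, and no robots.txt change will fix it.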
By the way, the accepted answer regarding your other robots.txt question is correct. I would stick with that for what you were asking about previously - robots.txt directives are not the cause of any DNS errors.
To allow all:
User-agent: *
Disallow:
To disallow all:
User-agent: *
Disallow: /
It's a bit confusing and unintuitive, I know. As a simple way of generating them, I tend to use a generator tool.