Why is Google saying that the User-Agent line is invalid?

@Kevin317

Posted in: #GoogleSearchConsole #RobotsTxt

I have the following robots.txt file:

User-agent: *
Sitemap: https://m.mysite.com.au/sitemap.xml


and when I use the Webmaster Tools to validate it, it flags the User-agent line as invalid.

However, when I test a URL against the same file, it comes back as "Allowed".

In both cases it is the same file. So my questions are:


1. What is wrong with the User-agent line?
2. I am being asked to remove the User-agent line. While I understand that * means all agents, I don't understand why it has to be removed.
3. Any ideas why Google is giving inconsistent results?





@Sherry384

This is because

User-agent: *
Sitemap: https://m.mysite.com.au/sitemap.xml

is incomplete. A User-agent line opens a group of rules, and that group must contain at least one Disallow (or Allow) directive. A Sitemap line does not count as a rule because it is independent of user-agent groups, so the validator sees a group with no rules and reports the User-agent line as invalid.

The correct syntax for robots.txt is:

To disallow all:

User-agent: *
Disallow: /


To allow all:

User-agent: *
Disallow:


To disallow specific directories:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/


Examples taken from www.robotstxt.org/robotstxt.html
The correct syntax for the sitemap is:

Sitemap: https://m.mysite.com.au/sitemap.xml

Note that the sitemap location must be a full, absolute URL, including the scheme.

You cannot restrict a sitemap to a particular user agent using the robots.txt file; the Sitemap line is independent of the user-agent groups and can appear anywhere in the file. Placing it last is a common convention rather than a requirement.

If you wish to allow all and specify a sitemap, it would be:

User-agent: *
Disallow:

Sitemap: https://m.mysite.com.au/sitemap.xml
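You can sanity-check a file like this yourself before running it through Google's tester. As a minimal sketch, Python's standard-library robots.txt parser can confirm that the allow-all group parses and that the Sitemap line is picked up (the URLs below are just the example domain from this thread):

```python
from urllib.robotparser import RobotFileParser

# The combined allow-all + sitemap file from above.
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://m.mysite.com.au/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# An empty Disallow means "allow everything" for all agents.
print(rp.can_fetch("Googlebot", "https://m.mysite.com.au/page.html"))  # True

# site_maps() (Python 3.8+) returns the Sitemap URLs found in the file.
print(rp.site_maps())  # ['https://m.mysite.com.au/sitemap.xml']
```

Note that urllib.robotparser is more forgiving than Google's validator, so a file passing here can still draw warnings in Webmaster Tools; it is a quick local check, not a substitute.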


