Google Webmaster Tools complains about missing robots.txt

@LarsenBagley505

Posted in: #Googlebot #GoogleSearchConsole #RobotsTxt

Nine days ago, I got a message from Google Webmaster Tools:


Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt.


But I don't have a robots.txt on that site, because robots.txt is optional and I want the whole site to be crawled. So why do I get this error message?

Perhaps of interest: the Google Webmaster Tools home page listed realitybuilder.com twice, under two variants of the domain. I don't know how that happened, but one variant simply redirects to the other, so it should not be necessary to have both listed. I have now deleted the redundant entry. Could that have caused the problem?


3 Comments


 

@Berryessa370

This error occurs when your robots.txt file exists but is unreachable. Your site should return a 200 HTTP status if the file exists, or a 404 if it doesn't; otherwise you will get this message from Google.


Before Googlebot crawls your site, it accesses your robots.txt file to
determine if your site is blocking Google from crawling any pages or
URLs. If your robots.txt file exists but is unreachable (in other
words, if it doesn’t return a 200 or 404 HTTP status code), we’ll
postpone our crawl rather than risk crawling URLs that you do not want
crawled. When this happens, Googlebot will return to your site and
crawl it as soon as we can successfully access your robots.txt file.
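
One way to verify this yourself is to request the file directly and look at the status code that comes back. A minimal sketch in Python (standard library only; the robots_status helper and the realitybuilder.com URL are just illustrative):

import urllib.request
import urllib.error

def robots_status(site):
    # Return the HTTP status code served for the site's /robots.txt.
    url = site.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.status   # 200: file exists and is reachable
    except urllib.error.HTTPError as error:
        return error.code            # e.g. 404: no file; 5xx: unreachable

print(robots_status("https://realitybuilder.com"))

A plain 404 here is fine; it is the responses other than 200 or 404 that make Googlebot postpone the crawl.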



 

@Goswami781

If you don't have a robots.txt file and Googlebot encountered an error, then I think you should define one.

robots.txt is a simple text file. If you want your entire website to be crawled, you can define your robots.txt file as below.

User-agent: *
Disallow:


In this file, User-agent: * means the section applies to all robots, and the empty Disallow: tells robots that they may visit all the pages on the site.
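
As a quick sanity check, Python's standard urllib.robotparser can confirm that this file allows everything (a sketch; example.com stands in for your own URLs):

from urllib.robotparser import RobotFileParser

rules = RobotFileParser()
rules.parse(["User-agent: *", "Disallow:"])   # the two lines shown above

# An empty Disallow blocks nothing, so any URL may be fetched by any robot.
print(rules.can_fetch("Googlebot", "https://example.com/any/page"))   # True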

Upload a robots.txt file with this content and check after a few days whether Google Webmaster Tools still shows the error.



 

@Shelley277

I'm not sure why Webmaster Tools does this, but I've had a similar problem with my site. When it was in development, I blocked it using the robots.txt file, then removed the block when it went live, but Webmaster Tools took a while to update itself.

What I'd recommend is doing a Fetch as Googlebot and submitting all pages; that should get Google looking at your site again more quickly.

One last thing: you're right that robots.txt is optional, but it might help the search engines if you make a robots.txt file and set it to:

User-Agent: *
Disallow:


Which is just like saying all pages are allowed.


