Why can't Google Bot access my site?

@Smith883

Posted in: #Googlebot

I repeatedly get a warning/error message from Google as follows:


Googlebot can't access your site


I have checked, edited, removed, replaced the robots.txt file on the site all to no avail.

Here is the content of the file:

User-agent: *
Allow: /


Am I doing something wrong here? What is the solution to make that bot happy?




2 Comments


 

@Nickens628

I also want to add that you should check your server configuration to make sure you're not giving special treatment to Google's block of IP addresses. In particular, make sure your server firewall isn't configured to block them.
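As a quick way to run that check, here is a sketch assuming a Linux server using iptables (the 66.249.x range used in the grep is an assumption; Google publishes its current crawler ranges):

```shell
# Sketch, assuming Linux + iptables: dump the active firewall rules and
# search for anything touching Google's crawl range. The 66.249.x range
# is an assumption; consult Google's published list of crawler IPs.
sudo iptables-save > /tmp/firewall-rules.txt
grep -n "66\.249\." /tmp/firewall-rules.txt || echo "no rules mention Google's range"
```

If a DROP or REJECT rule shows up for that range, that alone would explain the error.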

Also, make sure your .htaccess and other configuration files do not contain entries that return a bad response to requests from Google's IP addresses.
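A minimal sketch of that .htaccess check (the path is an assumption; point it at your actual document root): grep for directives that commonly block crawlers, such as "deny from" lines or rewrite rules keyed on the Googlebot user agent.

```shell
# Look for common crawler-blocking directives in .htaccess.
# /var/www/html is an assumed document root; adjust to yours.
grep -niE "deny from|googlebot" /var/www/html/.htaccess
```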

Another possibility is that your server configuration is set so that only certain browsers or devices are allowed access to the website and Google isn't one of them.
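One way to test for that, sketched with curl (example.com is a placeholder for your own domain): request the page once with a normal browser user-agent string and once with Googlebot's, and compare the status codes.

```shell
# Compare the status code served to a browser vs. to Googlebot's user agent.
# example.com is a placeholder; a healthy site returns 200 for both.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
curl -s -o /dev/null -w "browser:   %{http_code}\n" https://example.com/
curl -s -o /dev/null -w "googlebot: %{http_code}\n" -A "$UA" https://example.com/
```

A 403 or 5xx on the second request but not the first points at user-agent filtering.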

When you make changes to the server configuration, use the "Fetch as Google" tool in Google Webmaster Tools to make sure Google can access your content. Also run the site through other Google tools such as PageSpeed Insights.

Another thing that can help is to check your access logs, especially right after using the "Fetch as Google" tool, to see which IP address and user-agent string Google uses. That way you can make sure the real bot is being served the actual content instead of an error page.
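For the log check, here is a sketch assuming a combined-format access log at /var/log/apache2/access.log (both the path and the format are assumptions): it pulls out Googlebot requests and shows the client IP, status code, and requested path of each.

```shell
# List recent Googlebot hits from a combined-format access log, printing
# client IP, HTTP status, and requested path. Log path is an assumption.
grep -i "googlebot" /var/log/apache2/access.log | awk '{print $1, $9, $7}' | tail -n 20
```

A run of 403s or 5xx codes here, right after a Fetch as Google attempt, points straight at the misbehaving rule.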



 

@Speyer207

It's hard to say what the exact reason is, but here's a drill-down of what can cause this issue:


robots.txt blocking access to Googlebot.
Use of noindex and nofollow can cause issues.
The server is caching an old version of robots.txt; check your .htaccess for TXT MIME-type and expires entries.
DNS propagation is not fully complete; this process normally takes anywhere between 1 and 72 hours. DNS propagation checkers can help, but ultimately waiting is the only solution to this issue.
A 301 redirect loop: rules in the .htaccess file can sometimes cause redirect loops. While users may be able to use the site, Google may reject it. Use curl on your site; you should see an HTTP/1.1 200 OK in the header response.
Routing errors: DNS issues sometimes occur on networks, so while some people may not be able to access your site, others may. This applies to Google as well. It normally resolves itself.
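The curl check from the list above can be run like this (example.com stands in for your domain); the -L flag follows redirects, so a redirect loop shows up immediately as curl aborting once it hits its redirect limit:

```shell
# Print every status line curl sees while following redirects.
# A healthy site ends in a single "HTTP/... 200"; a redirect loop makes
# curl abort with "Maximum (50) redirects followed".
curl -sSIL https://example.com/ | grep "^HTTP/"
```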


Other issues include firewalls or misconfigured servers. 99% of the time it is due to DNS issues, which will resolve themselves assuming you have set up the DNS on your domain correctly.
