
Network unreachable: robots.txt unreachable

@Hamaas447

Posted in: #GoogleSearchConsole #RobotsTxt #Sitemap

I am trying to submit a valid sitemap in Google Webmaster Tools. Yet, it says:


Network unreachable: robots.txt unreachable
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.


Yet, I can access both my robots.txt and sitemap.xml myself. I have read other posts here and there, but could not work out what is causing this issue. Does anyone know?


2 Comments


 

@Welton855

You probably want to check with your site's hosting provider to find out more. These messages essentially mean that Googlebot can't reach your site, which is generally something that happens on the hosting side. For what it's worth, we see a lot of "unreachable" failures for the domain overall, so if you want the site listed in Google's search results, it would probably make sense to figure out why these requests are being dropped.
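If it helps when talking to the hosting provider, here is a minimal sketch in Python (standard library only) that requests robots.txt and the sitemap and reports the HTTP status and timing. example.com is a placeholder for your own domain and the 10-second timeout is just an assumption; running it repeatedly from a machine outside your own network can show whether requests are intermittently dropped, which local access would not reveal.

# Minimal reachability check for robots.txt and sitemap.xml.
# "example.com" is a placeholder for your own domain; the timeout
# value is arbitrary. Run it from outside your own network if possible,
# since local access may succeed even when external requests fail.
import time
import urllib.error
import urllib.request

URLS = [
    "https://example.com/robots.txt",
    "https://example.com/sitemap.xml",
]

for url in URLS:
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url}: HTTP {resp.status} in {time.time() - start:.2f}s")
    except urllib.error.HTTPError as err:
        print(f"{url}: HTTP error {err.code}")
    except Exception as err:  # timeouts, DNS failures, connection resets
        print(f"{url}: failed ({err})")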



 

@Samaraweera270

Unfortunately, the fact that you can access the files doesn't guarantee that Google can. Are you doing any user-agent detection or anything similar that may be interfering? Try downloading a page with the Fetch as Googlebot tool and see if it encounters any problems; if that works, the robots.txt and sitemap fetches should work too.
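One rough way to check for user-agent based blocking yourself is to request robots.txt twice, once with a browser-style User-Agent and once with a Googlebot-style one, and compare the responses. A minimal sketch follows; example.com is a placeholder domain, the User-Agent strings are only illustrative, and a clean result here doesn't prove the real Googlebot (which crawls from Google's own IP ranges) is treated the same way, so treat Fetch as Googlebot as the authoritative test.

# Compare how the server answers a browser-style request versus a
# Googlebot-style request for robots.txt. example.com is a placeholder
# and the User-Agent strings are illustrative only.
import urllib.error
import urllib.request

URL = "https://example.com/robots.txt"
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{name}: HTTP {resp.status}, {len(resp.read())} bytes")
    except urllib.error.HTTPError as err:
        print(f"{name}: HTTP error {err.code}")
    except Exception as err:  # timeouts, connection resets, etc.
        print(f"{name}: failed ({err})")

If the two requests get different results (for example, the Googlebot-style one gets a 403 or times out), that points at user-agent filtering on the server, a firewall, or a CDN rule.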

Depending on your site, there may be a few days or more between Google's attempts to access the file, so it could be that there was an issue at the time (like a server outage) that has since disappeared. If the Fetch tool doesn't turn up any problems, I'd consider waiting a while to see whether the issue recurs before doing more.


