Can I invoke Google to check my robots.txt?

@Kevin317

Posted in: #Google #GoogleSearchConsole #RobotsTxt

I read the answers in this question, but they still leave my question open: Does Google cache robots.txt?

I didn't find a way in the Google Webmaster Tools to invoke a re-download of my robots.txt.

Due to an error, my robots.txt was replaced with:

User-agent: *
Disallow: /


As a result, all my content has been removed from Google's search results.

Obviously, I want to correct this as soon as possible. I have already replaced the robots.txt, but I can't find a way to make Google update its cached version.
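While waiting for Google to re-fetch the file, you can at least confirm locally that the replaced robots.txt now permits crawling. Python's standard `urllib.robotparser` applies the same basic Disallow rules; the URL and the "fixed" file contents below are illustrative, not taken from the actual site:

```python
from urllib.robotparser import RobotFileParser

# The broken robots.txt from the question, which blocks everything.
broken = ["User-agent: *", "Disallow: /"]

# A corrected robots.txt (hypothetical fix): an empty Disallow allows all.
fixed = ["User-agent: *", "Disallow:"]

def can_google_fetch(lines, url="https://example.com/some-page"):
    rp = RobotFileParser()
    rp.parse(lines)          # parse rules from an iterable of lines
    return rp.can_fetch("Googlebot", url)

print(can_google_fetch(broken))  # False: everything is blocked
print(can_google_fetch(fixed))   # True: crawling is allowed again
```

This only checks the rules as written; it says nothing about when Google will pick up the new file.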


4 Comments


@Moriarity557

I hope this link will help you get your websites crawled: support.google.com/adsense/answer/10532?hl=en.
Remove the / from the Disallow line in your robots.txt file.
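To make the fix concrete, here is the blocking file from the question next to one possible corrected version (either an empty Disallow value or deleting the rule entirely allows crawling):

```
# Blocks everything:
User-agent: *
Disallow: /

# Allows everything:
User-agent: *
Disallow:
```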


@RJPawlick198

I faced the same problem when I started my new website satyabrata.com on June 16.

I had a Disallow: / in my robots.txt, exactly like Oliver. There was also a warning message in Google Webmaster Tools about blocked URLs.

The problem was solved yesterday, June 18. I did the following. I am not sure which step worked.


1. Health -> Fetch as Google: fetched robots.txt and the home page, then submitted them to the index.
2. Settings -> Preferred domain: display the URL as satyabrata.com.
3. Optimization -> Sitemaps: added an XML sitemap.


The warning message about blocked URLs is gone now, and a freshly downloaded robots.txt is shown in Google Webmaster Tools.

Presently, I have only two pages indexed in Google, the home page and robots.txt. I have 10 pages on the website. I hope the rest will get indexed soon.
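For reference, a minimal XML sitemap of the kind submitted in step 3 might look like the sketch below; the listed URL is a placeholder, and real sitemaps would list every page you want crawled:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://satyabrata.com/</loc>
  </url>
</urlset>
```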


@Si4351233

I had a problem where the images were moved to a separate CNAME server and a disallow was put on the images folder. The way I got it to clear was to have robots.txt retrieved via the Fetch as Google tool in Webmaster Tools. Once it reported that it had retrieved and read robots.txt, I submitted it. This broke a three-month embargo on crawling images: Google had reported it was reading the robots.txt file, but wasn't changing its spidering to match the rules that had been changed to allow the image folder. Within a week, images were being indexed again.

Might be worth a try. Google is known to occasionally get stuck and fail to reread the file.


@Pope3001725

You can't force them to re-download your robots.txt on demand. Google will re-crawl it and use the new data whenever it deems appropriate for your site. It tends to crawl robots.txt regularly, so I wouldn't expect your updated file to take long to be found. Keep in mind that even after the new robots.txt is picked up, it may take some time before your pages are re-crawled, and even more time for them to reappear in Google's search results.
