Mobile app version of vmapp.org

@Murray155

Posted in: #GoogleSearchConsole #RobotsTxt #SearchEngines

I have a website that still shows the old robots.txt in Google Webmaster Tools.

User-agent: *
Disallow: /


This blocks Googlebot from the entire site. I removed the old file and uploaded a new robots.txt with almost full access yesterday, but Google is still showing me the old version. The contents of the latest copy are below:

User-agent: *
Disallow: /flipbook/
Disallow: /SliderImage/
Disallow: /UserControls/
Disallow: /Scripts/
Disallow: /PDF/
Disallow: /dropdown/
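As a sanity check on the new rules, Python's standard-library `urllib.robotparser` can parse the file above and report what Googlebot would be allowed to fetch (a minimal sketch; the sample paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The updated rules from the question, pasted as a literal for offline checking.
NEW_RULES = """\
User-agent: *
Disallow: /flipbook/
Disallow: /SliderImage/
Disallow: /UserControls/
Disallow: /Scripts/
Disallow: /PDF/
Disallow: /dropdown/
"""

parser = RobotFileParser()
parser.parse(NEW_RULES.splitlines())

# Regular pages are crawlable under these rules; only the listed directories stay blocked.
print(parser.can_fetch("Googlebot", "/index.html"))      # True
print(parser.can_fetch("Googlebot", "/PDF/manual.pdf"))  # False
```

If this prints `True` for your normal pages, the new file's syntax does what you intend, and the remaining problem is only the cached copy on Google's side.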


I submitted a request to remove this file using Google Webmaster Tools, but my request was denied.

I would appreciate it if someone could tell me how to clear it from Google's cache and make Google read the latest version of the robots.txt file.




3 Comments


 

@Gretchen104

The first thing to do is to ensure that you correctly uploaded the new robots.txt file.

Visit yourwebsite.com/robots.txt from your browser to check it.
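That browser check can also be scripted. The sketch below (using only the Python standard library) downloads the live file and tests whether it still contains the site-wide block; the domain is a placeholder you would replace with your own:

```python
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_text: str) -> bool:
    """Return True if these rules deny all crawlers access to the whole site."""
    parser = RobotFileParser()
    parser.parse(robots_text.splitlines())
    return not parser.can_fetch("*", "/")

def fetch_robots(url: str) -> str:
    """Download a robots.txt file (network call)."""
    with urlopen(url) as resp:
        return resp.read().decode("utf-8")

# Example (requires network; substitute your own domain):
# live = fetch_robots("https://www.example.com/robots.txt")
# print("Still blocking everything?", blocks_everything(live))

# Offline check against the old rules from the question:
print(blocks_everything("User-agent: *\nDisallow: /"))  # True
```

If the live file fetched this way already shows your new rules, the upload worked and you are only waiting on Google's cache.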



 

@Si4351233

This is from Google's Webmaster developers site: developers.google.com/webmasters/control-crawl-index/docs/faq

How long will it take for changes in my robots.txt file to affect my
search results?

First, the cache of the robots.txt file must be refreshed (we
generally cache the contents for up to one day). Even after finding
the change, crawling and indexing is a complicated process that can
sometimes take quite some time for individual URLs, so it's impossible
to give an exact timeline. Also, keep in mind that even if your
robots.txt file is disallowing access to a URL, that URL may remain
visible in search results despite the fact that we can't crawl it. If
you wish to expedite removal of the pages you've blocked from Google,
please submit a removal request via Google Webmaster Tools.


And here are specifications for robots.txt from Google developers.google.com/webmasters/control-crawl-index/docs/robots_txt
If your file's syntax is correct, the best answer is simply to wait until Google fetches your new robots.txt file.



 

@Murray432

Usually Google checks regularly for changes to your robots.txt file.

If you sign up for a Google Webmaster Tools account you can find out more information, including the last time Google checked your robots.txt file.

EDIT: You can't request that your robots.txt file be removed either. The URL removal tool is for removing pages from the search index.

Have you confirmed that the new file has been uploaded correctly? There have been times I have tried to FTP a file to the server, but when I checked online the changes weren't showing up. When I logged in to my hosting control panel, and viewed my files, it was still the old file. Apparently the file wasn't transferred properly by my FTP client, although it wasn't displaying any errors.

So the first thing to do is confirm that your file was uploaded correctly by going to the robots.txt file in your browser and seeing what is there.

If the file was uploaded correctly, and you can see the new file in the browser, then your file is being cached somewhere else. Some CDNs cache static files, so this could be the cause if you use a CDN.
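One way to spot an intermediate cache is to look at the response headers served with the file. The sketch below (standard library only; the header names checked and the example domain are assumptions, since each CDN uses its own) filters a response's headers down to the cache-related ones:

```python
from urllib.request import urlopen

# Common cache-revealing header names; individual CDNs may use others.
CACHE_HEADER_NAMES = {"age", "cache-control", "x-cache", "cf-cache-status", "expires"}

def cache_headers(headers) -> dict:
    """Pick out the (name, value) pairs that reveal whether a cache served the file."""
    return {k: v for k, v in headers if k.lower() in CACHE_HEADER_NAMES}

# Example (requires network; substitute your own domain):
# with urlopen("https://www.example.com/robots.txt") as resp:
#     print(cache_headers(resp.getheaders()))
```

A large `Age` value or something like `X-Cache: HIT` suggests a cache or CDN is still serving the old copy, in which case purging the CDN cache should make the new file visible.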


