Need to encourage Google to refresh robots.txt

@Hamaas447

Posted in: #RobotsTxt #Seo #Sitemap #Xml

My old robots.txt is

User-agent: *
Disallow: /


It is blocking me from uploading a new sitemap and also from fetching robots.txt manually. I don't know what to do.


2 Comments


@Harper822

robots.txt is cached and will automatically be refreshed by Google within a day, so the passage of time will sort this out.


Caching: A robots.txt request is generally cached for up to one day,
but may be cached longer in situations where refreshing the cached
version is not possible (for example, due to timeouts or 5xx errors).
The cached response may be shared by different crawlers. Google may
increase or decrease the cache lifetime based on max-age Cache-Control
HTTP headers.

developers.google.com/webmasters/control-crawl-index/docs/robots_txt?hl=it
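
Since Google honors max-age in Cache-Control headers when deciding how long to cache robots.txt, you can hint at a shorter cache lifetime. A minimal sketch for an Apache server with mod_headers enabled (other servers will need their own equivalent):

<Files "robots.txt">
    # Ask caches, including Google's robots.txt cache, to treat
    # the file as stale after one hour instead of the default day
    Header set Cache-Control "max-age=3600"
</Files>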


@Eichhorn148

Googlebot will re-fetch robots.txt more often than most other files on your web server. You generally have to wait less than 24 hours. From Google's documentation:


A robots.txt request is generally cached for up to one day, but may be cached longer in situations where refreshing the cached version is not possible (for example, due to timeouts or 5xx errors). The cached response may be shared by different crawlers. Google may increase or decrease the cache lifetime based on max-age Cache-Control HTTP headers.


I'm not sure what you mean when you say that your old robots.txt file is blocking you from fetching the new one manually. Robots don't obey robots.txt when fetching robots.txt itself. Crawlers periodically re-fetch the file, even if you put a Disallow: /robots.txt line in it; there is no way to use robots.txt to prevent bots from checking robots.txt.
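
You can confirm this yourself: robots.txt is just an ordinary file on your server, so you can always fetch it directly and see exactly what crawlers see, for example (with example.com standing in for your domain):

curl -i https://example.com/robots.txt

The -i flag includes the response headers in the output, so you can also check the Cache-Control value mentioned in the quote above.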

One way to force Googlebot to download a page immediately is to use the "Fetch as Google" feature in Google Webmaster Tools (it's in the "Crawl" menu). You could use this feature to force Googlebot to fetch your new robots.txt file right away.

Webmaster Tools also has a "Blocked URLs" feature (also in the "Crawl" menu) that shows you the robots.txt file Google currently has and lets you test which URLs it blocks. You can edit the robots.txt contents in that tool to check that your changes block and unblock the URLs you expect.
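
Once the tester shows the behavior you want, the replacement file itself can be very small. A minimal sketch that unblocks the whole site and advertises the new sitemap (the sitemap URL here is a placeholder, substitute your own):

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml

An empty Disallow value allows everything, and the Sitemap directive lets crawlers discover your sitemap even before you resubmit it in Webmaster Tools.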
