"Restricted by Robots.txt" in Google Webmaster Tools when no robots.txt exists

@Samaraweera270

Posted in: #Google #Googlebot #GoogleSearchConsole #RobotsTxt

In Google Webmaster Tools, I have 832 errors in the sitemap, the error being "URL restricted by robots.txt". I don't have a robots.txt file, and the robots meta tag is set to "index, follow".

I did previously have a robots.txt file, and the period it existed could well correspond to the dates of the crawl errors in Google Webmaster Tools.

However, the dashboard shows these crawl errors remain, "updated Sept 23, 2011" (yesterday). All of the crawl errors are about 3 weeks old, which was probably before I deleted robots.txt.

Does the fact that the crawl errors remain as of yesterday indicate that Google has not recrawled the site?

Google has tried to download /robots.txt again, as recently as 4 hours ago, and has not found it, and will not for weeks, since the file no longer exists.

Do I need to ping Google to crawl the site again?
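(For reference, one way to prompt Google to refetch a sitemap at the time was its sitemap ping endpoint; a minimal sketch, where the sitemap URL is a placeholder:)

```python
from urllib.parse import urlencode

# Placeholder sitemap URL -- replace with the site's real sitemap location.
sitemap = "https://example.com/sitemap.xml"

# Build the ping URL; a plain GET request to it asks Google to refetch the sitemap.
ping = "https://www.google.com/ping?" + urlencode({"sitemap": sitemap})
print(ping)
# → https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml

# To actually send it (requires network access):
# from urllib.request import urlopen
# urlopen(ping)
```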




1 Comment


@Welton855

You might consider putting up a more permissive robots.txt so that Googlebot finds something to download and work with. It will wait a while if robots.txt returns a 404 error, in case that was unintentional on your part.

Try:

User-Agent: *
Disallow:
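You can sanity-check what a crawler will conclude from those two lines with Python's standard-library robotparser; here the permissive rules are parsed directly, and the test URL is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# Parse the permissive robots.txt suggested above (no network fetch needed).
rp = RobotFileParser()
rp.parse(["User-Agent: *", "Disallow:"])

# An empty Disallow rule allows every URL for every user agent.
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # True
```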


