
Google Webmaster Tools is showing incorrect warnings: blocked by robots.txt

@Gonzalez347

Posted in: #GoogleSearchConsole #RobotsTxt

I'm getting a warning when submitting my sitemap to Webmaster Tools; it says that my pages are blocked by robots.txt when they are not. Has anyone come across this before, or found a way to resolve it?

Here is the error message (screenshot not preserved):
Here is my robots.txt file:

User-agent: *
Disallow:
Sitemap: mydomain.co.uk/sitemap.xml
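
As a sanity check, Python's standard-library robots.txt parser can confirm that a file like this blocks nothing; this is a minimal sketch, and "mydomain.co.uk" is simply the placeholder domain from the question:

```python
import urllib.robotparser

# The robots.txt content from the question: an empty Disallow
# directive means every path is allowed for every user agent.
robots_txt = """\
User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot should be allowed to fetch any URL on the site.
print(rp.can_fetch("Googlebot", "http://mydomain.co.uk/sitemap.xml"))  # True
```

If this prints True but Webmaster Tools still reports a block, the problem is usually not the robots.txt file itself.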


1 Comment


@Annie201

Sometimes that warning means there is a meta noindex tag on your pages, not just robots.txt blocking the crawlers. Go to your website and view the source code in a private browsing session to check. Are you using a CMS such as WordPress?

Also, if you've made changes to allow robots, it can take up to 48 hours for Google to recognize the changes and successfully crawl and access your sitemap. In the meantime, you can go to Health in the Webmaster Tools sidebar and use Fetch as Googlebot; it will probably give you the same error no matter which page you try to access.
