Google Webmaster Tools is showing incorrect warnings - blocked by robots.txt
I'm getting a warning when submitting my sitemap to Google Webmaster Tools saying that my pages are blocked by robots.txt, when they are not. Has anyone come across this, or found a way to resolve it?
Here is the error message:
Here is my robots.txt file:
User-agent: *
Disallow:
Sitemap: mydomain.co.uk/sitemap.xml
1 Comment
Sometimes that warning means there is a meta noindex tag on your site, not just robots.txt blocking the robots. Go to your website and view the source code in a private browsing session. Are you using a CMS such as WordPress or something else?
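If a CMS setting or plugin has added one, you would see something like this in the page's head section (the exact attribute order or content value may vary):

<meta name="robots" content="noindex">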
Also, if you've made changes to allow robots, it can take several hours, or up to 48 hours, for Google to recognize the changes and successfully crawl and access your sitemap. You can go to Health in the sidebar in Google Webmaster Tools and use Fetch as Googlebot; it will probably give you the same error no matter which page you try to access.
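In the meantime, one way to sanity-check what crawlers are actually being served (this is just an illustration, substitute your real domain and full sitemap URL) is to fetch the files directly from a terminal with a Googlebot user agent:

curl -A "Googlebot" http://mydomain.co.uk/robots.txt
curl -A "Googlebot" http://mydomain.co.uk/sitemap.xml

Both should come back with a 200 status and the expected contents; if they don't, that is likely what Googlebot is seeing as well.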