Google Webmaster Tools claims my robots.txt is blocking almost all of my site
I have submitted a sitemap with many thousands of URLs, but Google Webmaster Tools reports that 9800 of my URLs are blocked by my robots.txt file.
What am I supposed to do to convince it that nothing is being blocked?
First, check your robots.txt file and make sure it contains the following directives:
User-agent: *
Disallow:
Sitemap: https://yourdomain.com/sitemap.xml
The empty Disallow line explicitly allows all crawling, and the Sitemap directive must use the full URL of your sitemap.
After updating the file, upload the new robots.txt to your site and resubmit it in Google Webmaster Tools.
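Once the updated file is live, you can sanity-check it locally before waiting for Google to recrawl. Below is a minimal sketch using Python's standard urllib.robotparser module; the domain and sample URLs are placeholders you would replace with your own.

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt (placeholder domain).
parser = RobotFileParser()
parser.set_url("https://yourdomain.com/robots.txt")
parser.read()

# A few sample URLs from your sitemap (placeholders).
sample_urls = [
    "https://yourdomain.com/",
    "https://yourdomain.com/some-page.html",
]

# Report whether Googlebot is allowed to fetch each URL under the current rules.
for url in sample_urls:
    status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, status)

If every sample URL prints "allowed", the robots.txt itself is not the problem and the Webmaster Tools report should clear once Google re-fetches the file.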
Try this and let me know if you have any further questions. For more details, see www.robotstxt.org/robotstxt.html