Effect of incomplete Disallow rule in robots.txt file
Solved: Pages were being blocked by meta robots deliberately
Many pages are reported as blocked by robots.txt, but when I checked the file I found no rule that would block them.
The robots.txt file is structured as follows:
Sitemap: domain.com/sitemap.xml
User-agent: *
Disallow: /directory-1/
Disallow: /directory-2/
Disallow: /directory-3/
Disallow: /directory-4/
Disallow: /directory-5/
User-agent: Googlebot-Image
Disallow:
None of the directories in the robots file match the URLs that are being blocked.
Could the empty Disallow rule be the issue, even though it applies only to Googlebot-Image?
1 Comment
I have just found that an empty Disallow rule grants the named robot complete access, so it cannot be what is blocking these pages.
Found the answer here: www.robotstxt.org/robotstxt.html
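To confirm this behavior, here is a minimal sketch using Python's standard-library urllib.robotparser. The directory and page names are placeholders mirroring the ones in the question, not real URLs:

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules mirroring the structure of the file in the question.
robots_txt = """\
User-agent: *
Disallow: /directory-1/
Disallow: /directory-2/

User-agent: Googlebot-Image
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A generic crawler is blocked from the listed directories...
print(parser.can_fetch("*", "/directory-1/page.html"))   # False
# ...but allowed everywhere else, including the URLs reported as blocked.
print(parser.can_fetch("*", "/some-other-page.html"))    # True
# The empty Disallow gives Googlebot-Image access even to those directories.
print(parser.can_fetch("Googlebot-Image", "/directory-1/page.html"))  # True
```

So the empty Disallow under Googlebot-Image only widens access for that one agent; the real cause of the blocking must lie elsewhere (e.g. meta robots tags, as the "Solved" note suggests).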