
Effect of incomplete Disallow rule in robots.txt file

@Sue5673885

Posted in: #RobotsTxt

Solved: the pages were being blocked deliberately by meta robots tags, not by the robots.txt file

A lot of pages are reported as being blocked by the robots.txt file, but when I checked the file I found no rule that would block them.

The robots.txt file is structured:

Sitemap: domain.com/sitemap.xml
User-agent: *
Disallow: /directory-1/
Disallow: /directory-2/
Disallow: /directory-3/
Disallow: /directory-4/
Disallow: /directory-5/

User-agent: Googlebot-Image
Disallow:


None of the directories listed in the robots.txt file match the URLs that are being blocked.

Could the incomplete Disallow rule be the issue, even though it applies only to Googlebot-Image?
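
For reference, the rules can be tested locally with Python's standard-library robots.txt parser. This is only a sketch; the domain and test paths are placeholders, not the actual blocked URLs:

# Test the posted rules with Python's built-in parser.
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /directory-1/",
    "Disallow: /directory-2/",
    "Disallow: /directory-3/",
    "Disallow: /directory-4/",
    "Disallow: /directory-5/",
    "",
    "User-agent: Googlebot-Image",
    "Disallow:",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A URL outside the listed directories is not blocked for generic crawlers.
print(rp.can_fetch("*", "https://domain.com/some-other-page/"))       # True
# A URL inside a listed directory is blocked for generic crawlers.
print(rp.can_fetch("*", "https://domain.com/directory-1/page.html"))  # False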


1 Comment


 

@Connie744

I have just found that an incomplete (empty) Disallow rule allows complete access: nothing is disallowed for that user agent, so it would not block anything.

Found the answer here: www.robotstxt.org/robotstxt.html
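
As a quick sketch of that behaviour (the URL below is just a placeholder), Python's standard-library parser treats an empty Disallow as allowing everything for that user agent:

# An empty Disallow value disallows nothing for the matching user agent.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: Googlebot-Image",
    "Disallow:",
])

# Everything is allowed for Googlebot-Image, so this rule blocks nothing.
print(rp.can_fetch("Googlebot-Image", "https://domain.com/any-page/"))  # True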
