Explicitly whitelist every page using robots.txt
I need to allow only specific pages on a site. Let's say:
/
/page1
/page2
...
Now I don't get how to whitelist the root page in robots.txt:
User-agent: *
Disallow: / # block everything
Allow: /page1
Allow: /page2
Allow: ? # how to allow the / ???
And I'm stuck here, because Allow: / will allow everything back.
Is it even possible?
You should be able to use a $ sign, which indicates the end of the URL.
For example
Allow: /$
If you go to the bottom of Google's robots.txt documentation, you can see a similar example: developers.google.com/webmasters/control-crawl-index/docs/robots_txt