How to block everything from being indexed except sitemap.xml
I want to block everything and index the sitemap.xml file alone. So I set it up as shown below:
User-agent: *
Disallow: /
Allow: /sitemap.xml
Sitemap: example.net/sitemap.xml
However, I am not sure whether this is correct or allowed.
You can remove these three lines from robots.txt:
User-agent: *
Disallow: /
Allow: /sitemap.xml
and give search engines the freedom to access all public-facing pages, so the odds of your useful content appearing in their indexes are much higher.
If you leave the setup the way it is, search engines will only attempt to access sitemap.xml (which doesn't render as an actual webpage users can use), and all other content on your server, including images, won't be crawled by search engines that respect robots.txt.
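If you still want to declare the sitemap location, a minimal sketch of a fully permissive robots.txt could look like the following (assuming the sitemap lives at https://example.net/sitemap.xml; the Sitemap directive expects an absolute URL):

User-agent: *
Disallow:
Sitemap: https://example.net/sitemap.xml

An empty Disallow directive explicitly allows crawling of everything, which is the same behavior most crawlers apply when no robots.txt exists at all.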