How to block everything from being indexed except sitemap.xml

@Angela700

Posted in: #Indexing #RobotsTxt #WebCrawlers

I want to block everything and index sitemap.xml file alone. So I do it as shown below:

User-agent: *
Disallow: /
Allow: /sitemap.xml
Sitemap: https://example.net/sitemap.xml

However, I am not sure whether this is correct or allowed.


1 Comment


@LarsenBagley505

You can remove these three lines from robots.txt:

User-agent: *
Disallow: /
Allow: /sitemap.xml


and let search engines access all your public-facing pages, which greatly improves the odds of your useful content appearing in their indexes.

If you leave the setup as it is, search engines that respect robots.txt will only attempt to access sitemap.xml (which doesn't render as a webpage users can actually use), and all other content on your server, including images, won't be crawled.
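If you want to check how a crawler would interpret rules like these, here is a minimal sketch using Python's standard-library `urllib.robotparser` (the rules and the `example.net` URLs are placeholders mirroring the question). One caveat worth knowing: Python's parser applies rules in file order with first match winning, while major crawlers such as Googlebot use the most specific (longest) matching path, so listing `Allow: /sitemap.xml` before `Disallow: /` keeps both interpretations consistent.

```python
# Sketch: test which URLs a robots.txt permits, using Python's
# standard-library robots.txt parser. The rules mirror the question;
# example.net is a placeholder domain.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /sitemap.xml
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Only the sitemap is fetchable; everything else is blocked.
print(parser.can_fetch("*", "https://example.net/sitemap.xml"))  # True
print(parser.can_fetch("*", "https://example.net/some-page"))    # False
```

Running this confirms the answer's point: under these rules a polite crawler may fetch nothing but sitemap.xml, so none of the pages the sitemap lists can actually be crawled.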
