Robots.txt disallow all except one directory and access to sitemap
robots.txt contains the following instructions:
User-agent: *
Allow: /public/
Disallow: /
Sitemap: http://sitedomain.com/sitemapindex.xml
It seems that in this case the sitemap index won't be accessible to search crawlers and should be moved to the /public/ folder. Is that correct, or will crawlers process the sitemap in any case?
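As a quick way to see how these directives evaluate, here is a minimal sketch using Python's standard urllib.robotparser (the sitedomain.com URLs are the placeholders from the question, and the page path under /public/ is made up for illustration). It only shows how this particular parser reads the rules, not how any specific search engine treats the Sitemap line.

# Sketch: evaluate the robots.txt rules from the question with Python's
# built-in parser (site_maps() requires Python 3.8+).
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /public/
Disallow: /
Sitemap: http://sitedomain.com/sitemapindex.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under /public/ match the Allow rule for every user agent.
print(parser.can_fetch("*", "http://sitedomain.com/public/page.html"))  # True

# The sitemap's own path falls under Disallow: / for path-based crawling.
print(parser.can_fetch("*", "http://sitedomain.com/sitemapindex.xml"))  # False

# The Sitemap directive is not a path rule; the parser still exposes it.
print(parser.site_maps())  # ['http://sitedomain.com/sitemapindex.xml']

Note that the Sitemap directive is parsed independently of the Allow/Disallow rules, so whether a given search engine fetches a sitemap that sits under a disallowed path is a question about that engine's policy, not about the syntax above.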