Hosting sitemaps in S3 and submitting them to Google Webmaster Tools
In robots.txt:
# robotstxt.org
User-agent: *
sitemap: https://s3.amazonaws.com/my-bucket/sitemaps/sitemap.xml
sitemap: https://s3.amazonaws.com/my-bucket/sitemaps/videositemap.xml
sitemap: https://s3.amazonaws.com/my-bucket/sitemaps/imagesitemap.xml
sitemap: https://s3.amazonaws.com/my-bucket/sitemaps/newssitemap.xml
Is it "enough"? GWT does not complain, but I can't submit sitemaps manually to GWT because it enforces my domain for sitemaps (it allows me do add/test sitemaps with URL starting from www.example.com/). Do I need to submit sitemaps to GWT if they are referenced in robots.txt?
No, it is not enough.
The Sitemap specification says the location of a sitemap file determines the set of URLs that can be included in that sitemap.
A sitemap file located at domain/catalog/sitemap.xml can include any URLs starting with domain/catalog/ but cannot include URLs starting with domain/images/. By the same rule, a sitemap hosted on s3.amazonaws.com cannot list URLs on www.example.com, as illustrated below.
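For illustration, a minimal sitemap file showing the mismatch the spec forbids (the page URL inside it is hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hosted at https://s3.amazonaws.com/my-bucket/sitemaps/sitemap.xml,
       yet it lists a URL under www.example.com, which is outside that location's scope -->
  <url>
    <loc>https://www.example.com/some-page.html</loc>
  </url>
</urlset>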
In my experience with Google Search Console, you need to submit the sitemaps manually rather than rely on the robots.txt reference to get the most benefit (e.g., resubmitting to request a reindex).
"but I can't submit sitemaps manually to GWT because it enforces my domain for sitemaps"
Put up forwarding from your domain to the sitemaps, for example with a ProxyPass in Apache (see the sketch below). That way, the sitemap URLs are on your own domain, and you can add them to GWT and see what happens.
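A minimal sketch of that Apache configuration, assuming mod_proxy, mod_proxy_http, and mod_ssl are enabled, and using the bucket path from the question (the /sitemaps/ prefix on your domain is an arbitrary choice):

# Allow proxied connections to an HTTPS backend
SSLProxyEngine on
# Serve the S3-hosted sitemaps under your own domain
ProxyPass        /sitemaps/ https://s3.amazonaws.com/my-bucket/sitemaps/
ProxyPassReverse /sitemaps/ https://s3.amazonaws.com/my-bucket/sitemaps/

With that in place, https://www.example.com/sitemaps/sitemap.xml serves the file stored in S3, so the sitemap's location matches your domain; you would then reference the www.example.com URLs in robots.txt and submit those to GWT.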