Google Search Console error with path to sitemap.xml
I have an online shop and the platform created its sitemap.xml automatically, but Google Search Console reports an error because my robots.txt looks like this:
User-agent: *
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Sitemap: sumki5.ru/sitemap-shop.xml
Crawl-delay: 5
User-agent: Yandex
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Crawl-delay: 5
Sitemap: sumki5.ru/sitemap-shop.xml
Host: sumki5.ru
Is this normal? Can it be a problem for SEO?
The sitemap URL has to be absolute. In your case it is missing the protocol (https:// or http://, depending on how your site is served). Change the line to:
Sitemap: https://sumki5.ru/sitemap-shop.xml
and Google will stop complaining in Search Console.
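For reference, the whole file with that fix applied would look something like this (a sketch that assumes the site is served over HTTPS; use http:// in the Sitemap lines otherwise):

User-agent: *
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Sitemap: https://sumki5.ru/sitemap-shop.xml
Crawl-delay: 5

User-agent: Yandex
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Crawl-delay: 5
Sitemap: https://sumki5.ru/sitemap-shop.xml
Host: sumki5.ru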
If you don't fix the problem, Google will not be able to access your sitemap. That isn't necessarily a disaster. Sitemap files are not needed for good SEO. In fact, they really don't help at all. What they do is give you additional insight into your site in Google Search Console. See: The Sitemap Paradox
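If you want to double-check the fix after deploying it, Python's standard library can fetch robots.txt and report the sitemap URLs it finds. A minimal sketch (RobotFileParser.site_maps() requires Python 3.8+, and the https:// URL is an assumption):

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (assumes the site is reachable over HTTPS)
parser = RobotFileParser()
parser.set_url("https://sumki5.ru/robots.txt")
parser.read()

# site_maps() returns the list of Sitemap: URLs found, or None if there are none
print(parser.site_maps())
# Expected after the fix: ['https://sumki5.ru/sitemap-shop.xml']

If the printed URLs are absolute, crawlers that read robots.txt will be able to locate the sitemap.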