Duplicate Sitemap URLs
I'm in the process of replacing my old sitemaps with a new set that should better represent our site. However, since bots can take days or even weeks to crawl all of our URLs after they are submitted, I don't want to adversely affect the site in the meantime. If I submit the new sitemaps alongside the old ones, so that every URL is temporarily listed twice, will this affect my site negatively? The old sitemaps would be removed after the new ones are crawled.
Is this the correct process for replacing sitemaps, or does anyone have a different recommendation?
Remove the old sitemap and submit the new one. For faster indexing, use bulk ping services for all URLs.
If you want to remove duplicate URLs from your current sitemap, Microsoft Excel can do that job.
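If Excel isn't handy, the same de-duplication can be scripted. Below is a minimal sketch (the URL list is illustrative, not real site data); it also normalises trailing slashes so `/page` and `/page/` count as one URL, which is an assumption about what "duplicate" means here:

```python
def dedupe_urls(urls):
    """Return the URLs with duplicates removed, preserving first-seen order."""
    seen = set()
    unique = []
    for url in urls:
        # Normalise trailing slashes so /page and /page/ count as one URL.
        key = url.rstrip("/")
        if key not in seen:
            seen.add(key)
            unique.append(url)
    return unique

urls = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/about/",   # duplicate after normalisation
    "https://example.com/contact",
]
print(dedupe_urls(urls))
```

The same approach scales to a sitemap of any size, since set membership checks are constant time.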
Yeah, that is the wrong approach.
Just replace your sitemap. Keep life simple!
Honest and legitimate bots won't fully spider your site just because you updated or replaced your sitemap. They are smarter than that. A search engine's decision to visit a page is based on metrics in its database, not on an updated sitemap. I update my sitemap often. Some sites update their sitemaps daily, and some every few minutes. It is not a problem. Go nuts and have fun!!
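Since the advice is simply to regenerate and replace the sitemap, here is a hedged sketch of building one with Python's standard library. The URLs are placeholders, and real sitemaps may also include optional tags like `lastmod`, which are omitted here:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal sitemap XML string for the given page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
print(sitemap)
```

Writing the returned string to `sitemap.xml` at the old file's location means search engines pick up the new version on their next fetch, with no duplicate submission needed.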