SEO: multiple ecommerce websites running off the same codebase
I was wondering about the effect a duplicate/similar catalogue of products would have on a site's SEO across 190 domains.
We have a dealer group with 190 members for a certain class of products, e.g. hardware & building supplies, in the same country. As part of their membership to the dealer group, our members receive printed catalogues of 2,500 to 7,000 products. They also get a web solution with all the catalogue products listed, which allows B2B ordering, searching and quote requests on these items. We are looking into making these sites more SEO friendly (readable URLs, responsive design, keywords etc.) - nothing serious, pretty much just making sure we check all the basics. All the websites point to the same codebase, which then fetches the member's template and product/category visibility settings and displays the site.
My questions are as follows...
Would there be any negative ranking effect from the repeated listings of products across different domains served from the same IP? Thousands of products have identical short and long descriptions, keywords, codes and images.
What would be the best way to implement an XML sitemap for each domain? I have considered generating a sitemap for each domain in a directory, i.e. /allsitemaps with a disallow rule in robots.txt, and then adding a route to the site that serves the correct sitemap depending on the requested domain, so sample.com/sitemap.xml actually returns sample.com/allsitemaps/sample.com/sitemap.xml. Do you think this approach would work?
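The routing idea above can be sketched as a small host-to-path lookup. This is only an illustration of the approach, not code from the actual site; the `/allsitemaps` directory name comes from the question, and the function name is hypothetical:

```python
def sitemap_path(host: str) -> str:
    """Map the request's Host header to that domain's sitemap file."""
    # Strip an optional port and lowercase, so "Sample.com:8080"
    # resolves to the same directory as "sample.com".
    domain = host.split(":")[0].lower()
    return f"/allsitemaps/{domain}/sitemap.xml"

# A route handler bound to /sitemap.xml would then serve the file at
# sitemap_path(request_host) instead of one shared sitemap.
print(sitemap_path("sample.com"))  # /allsitemaps/sample.com/sitemap.xml
```

The key detail is normalising the host before the lookup, so stray ports or mixed-case hostnames don't produce 404s for otherwise valid domains.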
Sorry if the questions seem basic. I have just enough SEO knowledge to get a unique site indexed reasonably well, but I have no idea how search engines react to a situation like this and am struggling to find any information on the topic.
Thanks in advance.
1 Answer
In short, yes. One of the biggest issues in SEO is duplication of content (both external and internal). If 190 domains all list the exact same products, that will create lots of duplicate content, and it will be difficult for the sites to rank.
However, there are ways around this - after all, Amazon is full of content found elsewhere on the web - such as adding unique content to each site, or using iframes to 'hide' the duplicated content.
Some good tips on this here:
Handling User-Generated & Manufacturer-Required Duplicate Content Across Large Numbers of URLs
You can host sitemap.xml files for different domains on a single domain; you just need to verify all the domains in the same Search Console account.
More info here: Manage sitemaps for multiple sites
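As a hedged illustration of how that cross-domain hosting can look: once the domains are verified, each domain's robots.txt may point its Sitemap directive at a sitemap URL hosted elsewhere. The host `central.example` below is a made-up placeholder for wherever the sitemaps are stored:

```
# robots.txt served for sample.com
User-agent: *
Disallow:

# The Sitemap directive accepts a full URL, which may live on a
# different (verified) host than the domain serving this file.
Sitemap: https://central.example/allsitemaps/sample.com/sitemap.xml
```

This avoids needing the rewrite trick entirely, at the cost of the sitemap URLs being visibly on another host.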