After fixing problems with duplicate content, will my search engine rankings return?

@Pope3001725

Posted in: #DuplicateContent #Google #Seo #Serps #WebsiteDesign

Background: I have been making a new website design at example.com/new which has been uploaded for some time as I have been redesigning the site. This site shares most of the same content as my current site at example.com.

Since I did not link to any of the pages on the new site, I thought Google would not index it, but it appears that it did. This would explain the drop in search rankings I have been experiencing.

Now that I have realized the problem, I will be replacing all the files in the /new directory with the duplicate files (i.e., replacing /new/gyms.html with /gyms.html) which should eliminate the duplicate content.

My question is, now that I will have fixed the problem of having duplicate content, will this SEO penalty go away? Will my rankings go back to the way they were? Will there always be some lingering effects?


1 Comment


 

@Welton855

The drop in visibility in search is almost certainly unrelated to the parallel website.

Almost all websites have content hosted on multiple URLs within the same site; that's something search engines routinely deal with. There's no reason to penalize a website for it, and certainly at Google there's no duplicate-content penalty when it comes to your own content.

The effects you'd see from content duplication within a website are:

- Google's algorithms will choose one URL to show for the content in search. It may not be the URL you'd choose; if you have a preference, make it known (through redirects, rel=canonical, internal links, etc.).
- Depending on the amount of duplication (is each piece of content hosted 2x, 20x, or 200x?), crawling can become too much load for the server, or new and updated content may not be picked up as quickly as it otherwise would be.


With a "reasonable" amount of duplication (in your case, just 2x?) and a reasonably strong server, neither of these is a real problem. Most users won't notice which URL was chosen, and crawling can still keep up.
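As a concrete illustration of making your preferred URL known: once the pages under /new/ are no longer needed, a server-side 301 redirect is the strongest signal. The snippet below is a minimal sketch assuming an Apache server with mod_rewrite enabled and an .htaccess file in the document root; only the /new/ path comes from the question, the rest is illustrative.

```apache
# .htaccess sketch: permanently redirect every URL under /new/
# to the same path at the site root, e.g. /new/gyms.html -> /gyms.html
RewriteEngine On
RewriteRule ^new/(.*)$ /$1 [R=301,L]
```

Alternatively, while the /new/ copies still need to stay online, each of them could carry a rel=canonical link in its head pointing at the corresponding root URL, which tells Google which copy to show in search.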
