How can a site contain duplicate content that's required, but not affect its SEO?

@Harper822

Posted in: #DuplicateContent #Google #Seo

I am making a site for a specific niche. This site is very similar:


Sample Article 01: alternativeto.net/software/google-chrome/
Sample Article 02: alternativeto.net/software/firefox/

If you check both URLs above, you'll find that the pages are very much the same; at least around 80% of the content is duplicated.

My pages will also be like this. Given the nature of the site, it happens automatically.

So what can I do? I don't want the site's rankings to suffer because of duplicate content.


3 Comments


 

@Jessie594

There is no doubt that duplicate content hurts rankings. But since, as you said, both pages cover the same topic, you should use different words and sentence structures to reduce the duplication.

For example, if one page says "pieces of the butter", the second page could say "butter pieces". That's just a hint: technically it won't count as duplicated content, because you can write the same sentence in many ways.

Almost every site has some pages whose topics are almost the same:

How to rank in Google?
On-page SEO
Off-page SEO guide
How to do SEO

Almost the same terms will be used in every one of those topics; neilpatel.com is a good example of this.



 

@Steve110

Your above-the-fold content is not duplicate. It looks like you've structured your site to show Google that one page is about Chrome and another page is about Firefox. As a result, Google should know which page to send users to when they search for Chrome or Firefox.

As long as you've clearly identified to Google what each page is about through keyword density for "Chrome" or "Firefox", and have different title and meta description tags, I think this should be fine.
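As an illustration of the "different title and meta description tags" point (the titles and descriptions here are invented, not taken from the actual site), distinct head tags for the two pages might look like:

```html
<!-- Page A: /software/google-chrome/ -->
<title>Best Alternatives to Google Chrome</title>
<meta name="description" content="A hand-picked list of browsers you can use instead of Google Chrome, with pros and cons for each.">

<!-- Page B: /software/firefox/ -->
<title>Best Alternatives to Firefox</title>
<meta name="description" content="Browsers to consider if you are moving away from Firefox, compared on speed, privacy, and extensions.">
```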

It is normal for sites to have similar content on pages for similar products. The related browser software below the fold isn't targeting the keyword of the browser name; it just offers further content-rich material for visitors to the page.

I don't think you should expect any duplication penalties for this.

If you want to increase the chances that these pages rank, enhance their original content as much as you can. Google tends to reward pages that have substantial amounts of text; over 1,500 words seems to work best.



 

@Kevin317

The problem with cases like this is that it can be very difficult to convince search engines that all of these pages need to exist in the index, because technically they are right about the duplicate content. (Around 80% similarity, as you mentioned, is more than enough.)

What's more, your header and footer also play a role here. I once had to fix a site where pages (that had to exist for legal reasons) shared the same header and footer but had different content (two or three HTML paragraphs each), and most of these were still seen as duplicates.

The solution is to find creative ways to add useful and original content until you hit the point where Google or Bing no longer see it as duplicate and put it back in the index. (Software like Moz's suite of tools can help to flag potential issues early.)

Make sure that, for every potential problem page:


Your URLs are unique and distinct.
Your H1s are unique.
Your title and description tags are unique and descriptive.
Canonical tags are present on every page and point to that page's own unique URL.
Each page has its own original structured data markup, included as JSON-LD.
Each page has its own original Open Graph markup as well.
Most importantly, add content: this can include additional paragraphs of copy for your main page subjects (such as the browser at the top of each page in your example URLs), distinct lists of helpful links on each page, unique asides, or even comments. Also consider adding images with unique alt text.

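A sketch of what several items on the checklist above might look like in a page's head section (the domain, values, and list items are placeholders, not the real site's):

```html
<head>
  <!-- Self-referencing canonical: points to this page's own unique URL -->
  <link rel="canonical" href="https://example.com/software/google-chrome/">

  <!-- Open Graph markup, unique per page -->
  <meta property="og:title" content="Alternatives to Google Chrome">
  <meta property="og:description" content="Browsers you can use instead of Chrome.">
  <meta property="og:url" content="https://example.com/software/google-chrome/">

  <!-- Structured data as JSON-LD, embedded in a script tag -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "name": "Alternatives to Google Chrome",
    "itemListElement": [
      { "@type": "ListItem", "position": 1, "name": "Firefox" },
      { "@type": "ListItem", "position": 2, "name": "Brave" }
    ]
  }
  </script>
</head>
```

Each page would get its own canonical URL, Open Graph values, and JSON-LD, rather than sharing one copy across the site.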

After you do all of this, resubmit your XML sitemap, or just the individual pages you updated, in Google Search Console to be crawled. If the pages are crawled but still not indexed, keep adding content and resubmitting until the ratio of duplicate content drops to an acceptable level and the pages go back in. It will take creativity and trial and error. Good luck!
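For the resubmission step, the sitemap itself is just an XML file listing each page's unique URL. A minimal fragment (URLs and dates are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/software/google-chrome/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/software/firefox/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Once it's uploaded, you can submit the sitemap URL under Sitemaps in Google Search Console, or request indexing of individual pages with the URL Inspection tool.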


