How to handle SEO for pages with similar textual content?
I have 3 pages with similar text content.
For example:
/hardwell/top-songs/
/hardwell/latest-songs/
/hardwell/similar-djs/
The main text content of each page is the biography at the top, and it is the same on all three pages. The rest of the content differs from page to page, but it is not text-heavy.
Would Google penalize me for this? How should one handle SEO in such a case?
In general, how should SEO be handled on websites that are not text-heavy, e.g. a music discovery website?
Search engine crawlers currently don't use image file contents as a content signal, so your pages (based on the two I checked) are at least 75% duplicate, if not more. You may want to find a duplicate content checker online and use it to make sure your pages aren't too similar.
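As a rough illustration of what such a duplicate content checker measures, here is a minimal sketch using a bag-of-words cosine similarity. This is a toy model with made-up sample text; real tools use shingling and semantic analysis rather than plain word counts.

```python
import re
from collections import Counter
from math import sqrt

def cosine_similarity(text_a, text_b):
    """Rough bag-of-words cosine similarity between two pages' text.

    A toy stand-in for a duplicate content checker: 1.0 means the word
    counts are identical, 0.0 means no words in common.
    """
    def vector(text):
        return Counter(re.findall(r"[a-z0-9']+", text.lower()))

    a, b = vector(text_a), vector(text_b)
    dot = sum(a[word] * b[word] for word in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Two hypothetical pages sharing the same biography but different lists:
bio = "Hardwell is a Dutch DJ and record producer. " * 5
page_top = bio + "Top songs: Spaceman, Apollo, Never Say Goodbye."
page_latest = bio + "Latest songs: Follow Me, Thinking About You."

# The shared biography dominates, so the score comes out high,
# which is exactly the duplicate-content situation described above.
print(round(cosine_similarity(page_top, page_latest), 2))
```

Because the identical biography outweighs the short lists, the score lands near 1.0, which mirrors why crawlers would treat such pages as near-duplicates.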
One way to fix the problem is to write more about the topic named in the URL. For example, on /hardwell/similar-djs/, write a few paragraphs about the similar DJs, and on /hardwell/latest-songs/, write about the songs in detail.
You can also follow closetnoc's suggestion of using rel="canonical" if you can't write unique content, but that may result in fewer of these pages being indexed by search engines.
Another idea is to merge two pages into one if the amount of textual content ends up being too little on each page.
Copying the exact same set of paragraphs in the same order, and pasting it from page to page is a bad idea.
I took a look at your page.
If only the lists change between pages, they will likely be seen as duplicates. If there is enough overlap in the lists, that can get you into dangerous territory. Even when the lists are not identical, they may carry less weight than the regular content on the page, and the pages may still be seen as duplicates. This is because, at one point, spammers used lists extensively to make pages appear different. It is better to be safe than sorry: link the pages to each other and make each one stand out for searches such as "latest songs".
Google no longer compares pages in a linear fashion. It computes semantic scores and compares pages using those scores. The reason is simple: for a long time, spammers simply reorganized content to escape duplicate content detection and pushed countless useless pages into the SERPs. Semantic scoring defeats this easily, because any set of pages that are too similar will be seen as duplicates regardless of word order.
If this is the case, you should pick one page and use a canonical tag on all the similar pages pointing to the one you chose.
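For instance, if /hardwell/top-songs/ were chosen as the primary page (an arbitrary choice for illustration; the domain is a placeholder), the other two pages would each declare it in their head:

```html
<!-- In the <head> of /hardwell/latest-songs/ and /hardwell/similar-djs/ -->
<link rel="canonical" href="https://example.com/hardwell/top-songs/" />
```

Search engines would then consolidate ranking signals onto the top-songs page instead of splitting them across three near-duplicates.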
This page tells you more than you need to know about canonical tags and is worth checking out: support.google.com/webmasters/answer/139066?hl=en
Here is an example from that page:
<link rel="canonical" href="https://blog.example.com/dresses/green-dresses-are-awesome" />