SEO & Reordering of duplicate content

@Shelley277

Posted in: #DuplicateContent #Seo #Url

I am currently working on a website, and the designer wants a menu that is setup like this:


Toplevel1
Toplevel2
    submenu1
    submenu2
    submenu3
Toplevel3
Toplevel4


All menu items have a title and content. When the user clicks Toplevel2, the page needs to show the titles and content of all its child submenus, like so:

Title for submenu1 content....

Title for submenu2 content....

Title for submenu3 content....

etc..

If the user instead clicks directly on submenu2, the exact same content should be displayed but with the selected submenu title and content placed first, like this:

Title for submenu2
content....

Title for submenu1
content....

Title for submenu3
content....

etc..
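To make that behaviour concrete, here is a minimal sketch of the ordering rule in Python (selected submenu first, the rest in their normal menu order). The Submenu class, the slugs, and the sample data are hypothetical; only the ordering logic comes from the description above.

```python
# Minimal sketch of the reordering rule described above.
# Submenu, the slugs, and the sample data are hypothetical.
from dataclasses import dataclass


@dataclass
class Submenu:
    slug: str      # e.g. "submenu2" -> domain.com/submenu2
    title: str
    content: str


SUBMENUS = [
    Submenu("submenu1", "Title for submenu1", "content...."),
    Submenu("submenu2", "Title for submenu2", "content...."),
    Submenu("submenu3", "Title for submenu3", "content...."),
]


def ordered_for(selected_slug=None):
    """Return every submenu entry, with the selected one (if any) first."""
    if selected_slug is None:               # Toplevel2 itself was clicked
        return list(SUBMENUS)
    selected = [s for s in SUBMENUS if s.slug == selected_slug]
    rest = [s for s in SUBMENUS if s.slug != selected_slug]
    return selected + rest


# domain.com/submenu2 renders: submenu2, submenu1, submenu3
print([s.slug for s in ordered_for("submenu2")])
```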

Although I could get into a general discussion as to whether or not this is good design, my actual question is: does this hurt SEO? They are all going to have different URLs:
domain.com/submenu1
domain.com/submenu2

etc..

The thing is, I can't just set a canonical URL, since the order of the content still differs between URLs.

(I posted this question on Stack Overflow first, but was asked to post it here instead.)


2 Comments


@Candy875

"Does this hurt SEO? They are all going to have different URLs"


The short answer is yes, especially if there are many menus and sub-menus. Since all your pages will have different URLs but nearly the same content, they will be considered near-duplicate content (which is low value/quality for the end user).

The solution is simple: pick one URL, use rel="canonical" pointing to it on all the other pages, and you will be safe.
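A minimal sketch of what that could look like in practice, assuming the Toplevel2 overview page is the URL chosen as canonical (that particular choice is an assumption made for illustration; the answer only says to pick one URL):

```python
# Minimal sketch: the same rel="canonical" link goes in the <head> of every
# page that shows this block of content (domain.com/submenu1, /submenu2,
# /submenu3, and the Toplevel2 overview itself).
# Choosing domain.com/toplevel2 as the canonical target is an assumption
# made only for illustration.
CANONICAL_URL = "https://domain.com/toplevel2"


def canonical_tag(url=CANONICAL_URL):
    """Build the <link> element to place in each page's <head>."""
    return f'<link rel="canonical" href="{url}">'


print(canonical_tag())
# -> <link rel="canonical" href="https://domain.com/toplevel2">
```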


@Jamie184

While Google can be quite forgiving, the experience gained from Google Scholar and citation matching allows Google to recognize this scenario natively. This means that content has not been read/understood in a purely linear fashion for some time now (since 2008), and that similar chunks of content are easily compared between pages and sites. While any negative effect might be argued, I do know that pages, paragraphs, sentences, and even phrases can be compared to easily recognize reordered content. After all, this was a trick used by content spammers years ago.

So I would expect this to be considered duplicate content. Depending upon the size of the site, you will likely see an effect at the very least. Smaller sites are not often penalized as a target, but an algorithm will probably affect your ability to reach searchers.
