
Multiple sites, same markup, different content, tiered linking = SEO penalty?

@Phylliss660

Posted in: #Backlinks #Content #Google #Html #Seo

I've developed a site, and I'd like to apply the exact same theme with the same HTML markup but different CSS styling to multiple different sites on different domains. The sites will be related in genre, but they will have completely different content.

I plan to use a tiered linking approach, where one of these sites sits at the bottom layer (most important) and all the other sites link (one way) to it.

Will using the same theme AND linking these same-genre sites together likely result in an SEO penalty from Google?


2 Comments


 

@Annie201

There's one thing that's still true... Content is KING.

You want to make sure the content you create is unique, and you want to vary the markup at least a little.

Here's an idea that will help you discover why it's a no-no to make near-duplicates of pages:

Create an account with Google Search Console (formerly Webmaster Tools) if you haven't already, and add and verify every single domain and subdomain you are using as part of your tiered scheme. As Google starts checking your pages, if the content is mostly duplicate, it will flag duplicate title and/or meta description tags under the affected site's HTML Improvements section.

Another thing you can do is search for a keyword density tool, run your sites through it, and see which keywords score highest. This can help you determine the focus of each page.
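The idea behind such a tool can be sketched in a few lines of Python. This is only an illustrative approximation of what "keyword density" usually means (a word's share of the total word count), not any particular tool:

```python
# Minimal keyword-density sketch: each word's share of the total word count.
# Illustrative only -- real tools also handle stop words, stemming, and phrases.
import re
from collections import Counter

def keyword_density(text, top=5):
    """Return the `top` most frequent words with their density (0..1)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return [(word, count / total) for word, count in counts.most_common(top)]

print(keyword_density("seo tips and seo tricks for seo", top=2))
```

Running this on a page's extracted text shows at a glance which terms dominate it, which is what the scoring in a real density tool boils down to.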

Also, there are duplicate content checkers on the net that let you compare any two pages and see how similar they are to each other.
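A rough version of such a check can be done with Python's standard library: `difflib.SequenceMatcher` gives a 0-to-1 similarity ratio between two documents. Real duplicate checkers use more robust methods (shingling, fingerprinting); this is only a sketch:

```python
# Rough duplicate-content check between two pages using stdlib difflib.
# Real checkers use shingling/fingerprinting; this is only illustrative.
from difflib import SequenceMatcher

def similarity(page_a, page_b):
    """Return a similarity ratio between 0.0 (different) and 1.0 (identical)."""
    return SequenceMatcher(None, page_a, page_b).ratio()

a = "<html><head><title>ABC</title></head><body>ABC</body></html>"
b = "<html><head><title>EFGH</title></head><body>EFGH</body></html>"
print(f"similarity: {similarity(a, b):.2f}")  # high, since only the text differs
```

Note how two pages whose visible text is completely different still score very high, because the shared markup dominates the comparison; this is exactly the code-versus-content problem discussed below.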

Also, if each page has only a tiny amount of user-readable content, the chance of it being flagged as a duplicate is sky-high, because the HTML code itself counts toward the duplication comparison.

For example, this code:

<!DOCTYPE HTML>
<html>
<head>
<title>ABC</title>
</head>
<body>
ABC
</body>
</html>


is about 82 bytes, of which 6 bytes are user-readable content. If you were to follow your idea and create a page like this:

<!DOCTYPE HTML>
<html>
<head>
<title>EFGH</title>
</head>
<body>
EFGH
</body>
</html>


It's 84 bytes, of which 8 bytes are user-readable content.

Mathematically, in the first case about 7% of the page is visible content and 93% is code; in the second case, about 9.5% is visible and 90.5% is code.
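That arithmetic can be reproduced with a short Python sketch that naively strips tags and whitespace to estimate the visible share of a page (a regex strip for illustration, not a real HTML parser):

```python
# Estimate what fraction of a page's bytes are user-readable content,
# by naively stripping tags and whitespace (illustration, not a real parser).
import re

def visible_ratio(html):
    """Fraction of the page's bytes left after removing tags and whitespace."""
    text = re.sub(r"<[^>]*>", "", html)     # drop all markup tags
    text = "".join(text.split())            # drop whitespace/newlines
    return len(text.encode()) / len(html.encode())

page = ("<!DOCTYPE HTML>\n<html>\n<head>\n<title>ABC</title>\n"
        "</head>\n<body>\nABC\n</body>\n</html>")
print(f"{visible_ratio(page):.1%} visible")  # roughly the ~7% figure above
```

The exact percentage shifts a little depending on how newlines are counted, but the point stands: on a thin page, almost all of the bytes are boilerplate markup.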

To help prevent penalties, keep each site naturally unique, as the other answer also advises: change your HTML up a bit, and add more rich text.

Also, if Google thinks pages are too similar to each other, it will likely index only the one it believes is authentic or best matches the user's query.



 

@Nimeshi995

If you are a company with several sites for different product lines, then linking between those sites in a natural way is your right, and Google recognizes this.

However, if you have several sites that are largely unrelated, in that branding and product lines are not linked, then you may run into trouble. Keep in mind that branding signals play a large part in this: I count 46 branding signals, and while not all of them will be present, at least half will be.

Here is a bit of what happens.

Google will look for relationships between sites including but not limited to registration information (including historical), names, addresses, phone numbers, e-mail addresses, link patterns, social media profiles, and so on. Google is very good at putting domains into realms by relationships. They will use a large number of data points in evaluating linkages/relationships between sites.

Another thing Google is very good at is spotting linking schemes between sites; it makes determinations, both automatically and manually, as to whether inter-site links are natural or unnatural.

Even if you think there are not many similarities Google can use to group your sites into a realm, I warn you that your templating, CSS, JavaScript, tags, and even your literal language can give you away. Google looks at language usage to identify common authorship between sites, and it uses this to find relationships between them.

Google hates linking schemes, and if it determines that you are using a scheme devised to build rank, whether or not it actually does, it will slap you silly, and recovery will be long and hard. There will be nothing you can do about it either. Keep in mind that linking schemes were among the factors most heavily exploited by spammers, and their schemes were rather sophisticated, far more so than the one you are proposing.

If you are creating natural links, the kind any webmaster would make, then you are okay. But step beyond this, and it will simply be a matter of time before all your sites suffer a severe loss for a very long time. You will have severely downgraded your trust score with a history of spamming that is never forgotten. Trust spans all domains you may want to operate: if you have one spam site, Google is automatically wary of every site it determines is within your realm.

You have to ask yourself one important question: am I linking naturally? It appears you are building a linking scheme that is intentional rather than natural, and I would seriously advise against it. Instead, link naturally: think marketing, not search engine influence, and treat content as a product for humans, not machines.
