
Is it best for SEO to have one large domain vs multiple small domains?

@Heady270

Posted in: #MultipleDomains #MultiSubdomains #Seo

Building genuine links is tough. If you have only one domain that does everything, rather than many domains that are each specific, it will be easier to build up that one website. That said, does Google favor a niche website over a giant website?

For example, say I want to make a domain that has multiple websites running as subdirectories. This way I can generate more links back to my domain from other sites. I want to have sub-sites that cover movies, sports, and video games. All three are very diverse categories and mostly unrelated.

If I write a review of Batman, will it be more likely to come up in search results if the site is a dedicated movie site? I.e.
everything.com/movies/batman-review vs. movies.com/batman-review
That is one scenario, but what if the sub-site were niche-related? Say I want to make a movie fan site for Batman, Superman, and several other movies. Would this work better under my movie domain?

I would hope that it doesn't matter, as the content of the page itself should be all that matters, but I don't know whether Google uses the niche of the website as a factor. If the niche is "everything" then it's not really a niche site, but technically it is, as I would be using a WordPress multisite that uses subdirectories instead of subdomains.

4 Comments

 

@Alves908

To begin:

Okay. There are several things to consider here. Looking at things from Google's point of view, you can easily see why Google makes some of the decisions they make. I will cover just a bit of this.

First of all, humans have amazing cognitive abilities far beyond what any computer can do. Even Watson cannot come close to the intelligence contained within our subconscious minds, and programmers can only code what is realized from our conscious minds. This is a very slow process.

So computers can only do what we tell them, based upon what we realize or learn. It is not possible for a computer to do what the human mind can do. Computers are highly dependent upon the processes we create.

Search engines have come a long way and use well-established sciences and techniques to do their work. Much of what is new is how these sciences and techniques are used. Search engineers are becoming very clever these days.

Be that as it may.

There are still some limitations. Google does prefer that a site is about one thing or a few similar things. You and I know that is not fully practical; however, from the search engine's perspective, using the technologies available, it is difficult to understand what a site is about if its topics are cars, glaciers and polar ice, and chess for the blind. Search engines want to match search intent. This requires a detailed understanding of your content.

How is this done?

Semantics. Using semantics, it is difficult to understand how a highly topically diverse site relates to a search query. However, a site about glaciers, polar ice, climatology, geology, etc. becomes easier to understand. These topics are related.

Keep in mind that search is broken into two primary processes: creating a search engine index, and satisfying a search query. The only link between the two is the index. As detailed as search is, imagine only being able to communicate between the two functions through a database that is so vast and complex. This is a very difficult task.
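
To make that two-process split concrete, here is a minimal, purely illustrative sketch: an offline step builds an inverted index, and the query step sees nothing but that index. The data and function names are assumptions for illustration, not how Google's actual pipeline is structured.

```python
from collections import defaultdict

# Process 1: indexing. Runs offline over crawled documents and produces
# the only artifact the query side will ever see.
def build_index(documents):
    index = defaultdict(set)          # term -> set of doc ids (a posting list)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

# Process 2: query time. It never touches the documents directly;
# everything it knows comes from the index built above.
def search(index, query):
    postings = [index.get(term, set()) for term in query.lower().split()]
    if not postings:
        return set()
    return set.intersection(*postings)

docs = {
    1: "Batman movie review and plot summary",
    2: "Polar ice and glaciers in a warming climate",
    3: "Superman and Batman compared as movie franchises",
}
index = build_index(docs)
print(search(index, "batman movie"))   # {1, 3}
```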

Do not let RankBrain pass unnoticed. Semantics has always been used for the search query; however, RankBrain was likely developed specifically to take as much advantage as possible of the changing semantic potential of the ever-changing search engine index. As the index grew in complexity, the query side did not follow as quickly. Enter RankBrain.

Before Google, the search engine world consisted of term matches and probability analysis to determine which searches matched which sites. For a while this was as good as it got, and it could do a fair job. Google recognized that this methodology was limited and that a well-established science, semantics, would significantly improve the odds of a relevant match far beyond the existing methods.
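
For reference, that older term-plus-probability style of matching is roughly what a TF-IDF scorer captures. The sketch below is a generic illustration of term-based ranking over made-up documents, not any particular engine's formula.

```python
import math
from collections import Counter

def tfidf_score(query, documents):
    """Score each document against the query by summing TF-IDF weights
    of the query terms. Purely illustrative of term-based ranking."""
    n_docs = len(documents)
    tokenized = {doc_id: text.lower().split() for doc_id, text in documents.items()}
    # Document frequency: how many documents contain each term.
    df = Counter()
    for tokens in tokenized.values():
        df.update(set(tokens))

    scores = {}
    for doc_id, tokens in tokenized.items():
        counts = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            if term in counts:
                tf = counts[term] / len(tokens)          # term frequency
                idf = math.log(n_docs / df[term])        # inverse document frequency
                score += tf * idf
        scores[doc_id] = score
    return scores

docs = {
    "movies.com/batman-review": "batman review of the new batman movie",
    "everything.com/sports/golf": "golf scores and sports news",
}
print(tfidf_score("batman review", docs))
```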

Fast forward to today: semantics is a huge part of search, with varying degrees of implementation, and Google is clearly the leader. But what does this mean?

It means that term indexes are no longer used directly or exclusively. Even Google's original term index that allowed semantic search has been replaced or supplanted with a more topical index. This does not mean that terms are not indexed. It does, however, mean that the primary indexing method is based upon semantic analysis and matrices, not upon which specific terms exist on which pages. Semantics is primarily about linguistics. At the simplest end, it is about topics.

Are you seeing where I'm going?

If a site has topics that are too divergent, the semantic scoring for the site becomes diluted. Search engines are not able to compartmentalize diverse topics well; however, they can understand related topics extremely well.
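
Here is a toy illustration of that dilution, using made-up three-dimensional topic vectors and cosine similarity. Real systems use learned representations, but the arithmetic of averaging divergent topics is the same.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical topic space: (movies, sports, video games).
movies_page  = (1.0, 0.0, 0.0)
sports_page  = (0.0, 1.0, 0.0)
games_page   = (0.0, 0.0, 1.0)
batman_query = (1.0, 0.0, 0.0)   # a movie-intent query

# A focused movie site: its overall topic vector stays close to the query.
focused_site = movies_page
# An "everything" site: averaging divergent topics dilutes the signal.
mixed_site = tuple((a + b + c) / 3 for a, b, c in zip(movies_page, sports_page, games_page))

print(cosine(focused_site, batman_query))            # 1.0
print(round(cosine(mixed_site, batman_query), 2))    # ~0.58
```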

Because search engines are not cognitive, they are highly dependent upon ontologies. Ontologies are conceptually simple databases that can represent dictionaries, thesauruses, fact links, specific topics, relationships, etc. For some topics, specific ontologies have been created; medical science is a prime example, with ontologies for disease, diagnosis, and anatomy. These ontologies can be used to understand content. As well, wikidata.org is an ontology based upon Wikipedia and is also used.
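
Conceptually, an ontology is just a set of explicit, machine-readable relationships between concepts that a non-cognitive system can traverse. A tiny sketch with invented triples follows; real ontologies such as wikidata.org are vastly richer.

```python
# A toy ontology: each fact is a (subject, relation, object) triple.
facts = [
    ("batman", "instance_of", "film character"),
    ("batman", "appears_in", "batman (1989 film)"),
    ("batman (1989 film)", "instance_of", "film"),
    ("film", "subclass_of", "creative work"),
]

def related(concept):
    """Return every concept directly linked to the given one."""
    out = set()
    for subj, rel, obj in facts:
        if subj == concept:
            out.add((rel, obj))
        elif obj == concept:
            out.add(("inverse " + rel, subj))   # traverse the link backwards
    return out

print(related("batman"))
print(related("film"))
```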

Let me remind you that search engines cannot understand more than what humans can tell them. The primary mechanism for communicating human intelligence to a search engine is the ontology. I also want to remind you that this can only be based upon what humans can cull from our conscious minds. In this, search engines are limited and always will be. However, these limitations are shrinking very fast.

If you have been paying attention to the knowledge graph/vault, you may have noticed that it grows by topic. This is because specific ontologies are used to build the knowledge graph in shifts. This is a clue to the inner workings of Google. Why? Because we know that Google does not only use semantics in regard to linguistics to satisfy search, but also fact-link-based semantics, which makes up the knowledge graph. This process has been referred to as the Answer Engine. Why is this important to know? Because how semantics matters and affects search is easily evidenced.

So what does all of this mean?

No page exists in a vacuum, nor does a site. While I understand the sentiment that each page should rank by itself, and indeed each page has its own PageRank, that should not confuse the issue: no page can exist unto itself in search. Pages, and indeed sites, have relationships, and these relationships matter. Hierarchical relationships between pages exist both as a physical path and through how you link to a page. As well, relationships between sites and pages exist through inbound and external links. What is not seen are the semantic relationships between similar topics and content within the semantic index.
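
Since per-page PageRank comes up here, a compact power-iteration sketch over an invented internal link graph shows how per-page scores emerge purely from link relationships. The damping factor and graph are illustrative assumptions, not Google's production setup.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simple power-iteration PageRank over a dict: page -> list of outbound links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:            # dangling page: spread its rank evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outbound)
                for target in outbound:
                    new_rank[target] += share
        rank = new_rank
    return rank

site = {
    "/movies/batman-review": ["/movies/", "/"],
    "/movies/":              ["/", "/movies/batman-review"],
    "/":                     ["/movies/"],
}
print(pagerank(site))
```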

It becomes important that a site be about one topic or a few related topics. The reason is simple: you are building performance strength within the search engine index. In fact, topical focus and linguistic discipline become very important.

This is where sub-domains or niche sites fit in.

Another reality also comes into play.

Not every topic is going to perform well. Clearly there are trend-based sites that will always perform well; however, there is the danger of missing topical depth. For example, how deep will a site about celebrities be compared to any other site? We can see this with SEO sites that follow headline topics lock-step with other sites without deep content. These sites primarily work for trend topics. If the site is not authoritative or done well, then it likely will not perform well at all.

As well, not all non-trend topics perform well. The reason is simple: the topic may be uninteresting, too archaic, very technical, targeted at a specific audience, aimed at a small niche, etc. Not all sites will enjoy links and search traffic in return for the effort. This does not mean these sites have no value. Far from it.

Search engines exist as a business. Unfortunately, this means that they are biased toward trending topics, e-commerce, and any other topic that benefits the business model, and will largely ignore other sites. In the past few years, it has become harder and harder to find scientific work through all of the trend-based blogs. This is particularly true in certain niches such as SEO. It is unfortunate.

Search engines are highly cognizant of markets and branding. It is an uphill battle to compete with existing authoritative sites even when those sites are not as good. Obtaining reward for your work requires constant marketing. It also requires upping the ante.

This can mean several things.

For example, increasing the semantic value of a site. The more that is carefully written on a topic in detail, the more minutiae it covers, and this topical detail can strengthen the semantic scoring for both the site and the page. The value is that even for simple searches, the semantic score will be stronger than for sites that do not show expertise. Semantic scoring supports several aspects of search, including topic strength, expertise derived from the fact-link database, educational level, reading level, matched intent, and even social intent. However, going far into areas such as expertise and educational level can mean that matched intent targets a thinner audience. As well, there is leakage: if you are politically liberal, for example, your position will leak through no matter how careful you are. This is where social intent comes into play. Search engines do score for this; it is part of the semantics landscape.
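
If it helps to picture how several such aspects could roll up into one score, here is a purely hypothetical weighted combination. The signal names and weights are invented for illustration only; nothing here is Google's actual scoring.

```python
# Purely hypothetical signal names and weights, only to illustrate the idea
# of several semantic aspects being combined into one page-level score.
weights = {
    "topic_strength":    0.4,
    "expertise":         0.3,
    "matched_intent":    0.2,
    "reading_level_fit": 0.1,
}

def combined_score(signals):
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

focused_movie_page = {"topic_strength": 0.9, "expertise": 0.8,
                      "matched_intent": 0.9, "reading_level_fit": 0.7}
generic_page       = {"topic_strength": 0.4, "expertise": 0.3,
                      "matched_intent": 0.6, "reading_level_fit": 0.7}

print(combined_score(focused_movie_page))  # ~0.85
print(combined_score(generic_page))        # ~0.44
```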

Please also understand that a tight scope can mean high authority.

There is so much that goes into topic scope that a whole book can easily be written. But do know this. Topic scope is important.

Size does matter. Sorta.

For a period until recently, Google was rather keen on much larger sites. Part of the reason could be the simple fact that more pages offer more search opportunities. But once Google decided that certain kinds of large sites were bad for user experience, such as article marketing sites, business listing sites, and link directories, it did a bit of an about-face and now seems to target large sites that fit into markets seen as producing primarily low-quality content.

This goes hand in hand with using semantics to divide the web landscape into market segments. Within these segments you may be able to find good sites and bad sites; Google makes the call. However, page count is a trigger metric, along with content creation velocity, which was once a more positive metric. Such a trigger indicates that your site may require additional metrics to be evaluated for quality, for example.

On the flip side, in certain markets, such as whois sites, there are expectations as to page count, and any site not measuring up may be seen as not being competitive.

For the rest of us, there is a happy medium. Too many content pages for a particular topic can mean semantic dilution or duplication of content topics and themes. Too many pages can be an indication of not crafting the site's content in a lean and strategic way.

The converse is also true. Too few pages lack search potential and can indicate that a topic is not covered well or thoroughly.

It is not necessarily about the number of pages; it is about the effectiveness of the pages you have. Too few pages for some site topics may mean that the site will be ineffective; too many can mean the same thing. It is partially defined by market, topic, and semantics. But for some purposes, a single-page site is just right and will perform well.

If you are within a topic or market that is not targeted, and your site is not uselessly large (often the result of some form of automation), then more well-written, targeted topical pages can mean more search opportunities, and fewer pages mean less opportunity. But all of this is highly dependent upon the topic, the content creator, and the market. The answer as to whether a larger site is more effective than a smaller one is not binary or blanket. It is complex and dependent upon quite a few factors. It is not one or the other; it is about overall effectiveness.

Final Thought

One of the things I cannot tell you is where your topics fit into your content schema. You have to use your experience for that. However, what I do want to make sure you understand is this: search is a mechanical process, and for your site to perform well, you must recognize that fact and adhere to it in a reasonably educated way. You must deal with the reality on the ground and not retreat into a theoretical world. Theory is nice, but not practical. Not in search.

Building a semantically significant site is the key. You are right about building inbound (back) links; it seems nearly impossible. Google is correct in its belief that natural links are best. The reason is that these links are semantically significant. Google is giving you a valuable clue. The argument that inbound links do not matter much anymore is both true and false.

Links still matter a lot. However, the relationship between semantic scoring and inbound links is closer than ever before. It is possible to build a site that performs well in search if the site is done well, and links add to this. However, links can no longer trump topical significance. Why? Because using both links and semantics allows Google to more easily discern site quality. The third leg of the stool is how a site performs within the search engine results pages (SERPs). But that is a separate topic for another day.



 

@Hamaas447

If you create several websites with different root domains, and two pages link to each other from two different domains, you have to build up reputation for both domains. Consequently, you create extra work compared to building reputation and getting internal links for a single domain.
If you go for a niche website (Batman), you can go deep in terms of information granularity, but you will attract fewer people. If you have a more general website, or a subject that allows you to cover different topics, you will attract more people and more links. You can still write detailed pages.
What is cool about subdirectories on the same WordPress install is that an unpopular page can benefit from the link juice of a very popular page on another subject (in a different WordPress category), thanks to a plugin like CRP for WordPress (Content Related Pages). In other words, in the superhero world, Batman can benefit from Superman and vice versa. Unfortunately, with a multisite that would not be possible; you would need to manually code the interactions between pages.
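
Conceptually, such a plugin just picks other posts that share tags or categories and links to them, letting a popular post pass internal link equity to related ones. A rough sketch of that idea in Python follows; it is an illustration, not the plugin's actual code.

```python
def related_posts(current, all_posts, max_links=3):
    """Rank other posts by how many tags they share with the current post."""
    scored = []
    for post in all_posts:
        if post["slug"] == current["slug"]:
            continue
        shared = len(set(post["tags"]) & set(current["tags"]))
        if shared:
            scored.append((shared, post["slug"]))
    scored.sort(reverse=True)
    return [slug for _, slug in scored[:max_links]]

posts = [
    {"slug": "/movies/batman-review",   "tags": ["batman", "superheroes", "movies"]},
    {"slug": "/movies/superman-review", "tags": ["superman", "superheroes", "movies"]},
    {"slug": "/sports/golf-roundup",    "tags": ["golf", "sports"]},
]
print(related_posts(posts[0], posts))   # ['/movies/superman-review']
```
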
Subdirectories are treated the same as subdomains according to Google: webmasters.googleblog.com/2011/08/reorganizing-internal-vs-external.html



 

@Smith883

Google prefers content over everything else, and the trickery of cross-linking subdomains or subdirectories is well known to them; Matt Cutts has called this out in the past.

Having many small sites means you are dividing your traffic up among them. One site that gets all the traffic will rank higher.

Your goal should be to supply quality content that people care about and not work on gimmicks.



 

@Odierno851

Short answer: nowadays, either large or small sites can rank well.

In earlier times, Google depended heavily on backlinks. Now Google uses 210+ signals in its algorithm (200 signals plus 10+ more added after 2013), and based on my analysis of my friends' sites, I have seen that Google does not depend too much on PageRank, i.e. it has reduced the importance of backlinks in the algorithm compared to 2014 or 2015.

A niche site can rank very well nowadays. Google generally wants to display expert content at the top. If a niche site has trusted content, it can rank highly in the SERPs. For example, my friend's website ranks very well for many competitive keywords, and he has not built a single link. You can also watch the John Mueller hangout videos where he said my friend's site is getting tons of traffic without building any backlinks. So it looks like they continue to change the algorithm so that webmasters can focus on website performance and content rather than on building backlinks.

Bigger sites usually rank well in the SERPs. Google generally treats every webpage the same, but having a bigger site does help with an SEO boost, because the site already has reputation from other websites and therefore a better domain reputation compared to a brand-new domain name.

So build whichever is easiest for you, and let Google do its job. Maybe Google will reduce the importance of backlinks even further in the future; they have already reduced it a lot, but they could reduce it even more and add other signals.

If your content is good, it does not matter whether it is on a subdomain, a subdirectory, or a different domain; your website's future is bright.


