Is it possible to ruin your SEO by cross-linking and overly structuring a website?

@Alves908

Posted in: #GoogleSearch #SearchEngines #Seo #WebCrawlers

We're trying to categorise our website data more and more.

This will lead us to have many, many categories for different sections of information.

Can doing too much of this lead to bad SEO?

My feeling is that, if not done correctly, silos can occur; or, short of silos, the crawler may be led to the same page over and over through different routes.


3 Comments


 

@Lee4591628

Google has no problem with the number of categories. What matters is great content with easy navigation for your visitors.

I suggest you keep the categories simple and easy to navigate through the site. That helps users find what they want in less time and keeps them satisfied.



 

@Kimberly868

The issue isn't 'can Google crawl it' - because yes it can. It's whether you're passing page rank and link equity.

Page rank is widely reported as deprecated. However, there's evidence it still factors into Google's algorithm when it weighs the importance of linked pages. If you have a heavily backlinked page, the pages it links to will gain an associated boost. Be careful about leeching equity away from such pages when you could be using them to boost important pages. Footer links tend to be ignored for these considerations, if that helps.

There are two simple techniques you can use to improve your site's appearance to the crawler without having to change the link structure:


Create breadcrumbs with schema.org structured data to allow Google to file your content correctly and build a hierarchy (a sketch of the markup follows this list).
Use a clear folder structure, e.g. example.org/Category/Sub-Category/Product/. This communicates the hierarchy in the URL itself.
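
To make the first point concrete, here's a minimal sketch (Python used only to generate the markup) that builds schema.org BreadcrumbList structured data from a category path. The example.org URL and category names are placeholders, and I'm using the JSON-LD serialization, which Google accepts alongside microdata for breadcrumbs.

```python
# Minimal sketch: build schema.org BreadcrumbList JSON-LD from a category path.
# The base URL and segment names are hypothetical placeholders.
import json

def breadcrumb_jsonld(base_url, segments):
    """Return BreadcrumbList JSON-LD for a path like /Category/Sub-Category/Product/."""
    items = []
    url = base_url.rstrip("/")
    for position, segment in enumerate(segments, start=1):
        url = f"{url}/{segment}"
        items.append({
            "@type": "ListItem",
            "position": position,
            "name": segment.replace("-", " "),
            "item": url + "/",
        })
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    }, indent=2)

print(breadcrumb_jsonld("https://example.org", ["Category", "Sub-Category", "Product"]))
```

Embedding the resulting JSON on each page (inside a script tag of type application/ld+json) tells Google explicitly where the page sits in your hierarchy, regardless of how many internal routes lead to it.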

10% popularity Vote Up Vote Down


 

@LarsenBagley505

I happen to have a pretty large site myself with lots of categories, and Google doesn't have problems with it.

What I suggest is not including URLs in your sitemap that aren't meant to be indexed by search engines (because they carry a "noindex" directive or have a canonical pointing to a different URL). Otherwise you're over-organizing your site, and Google won't index those pages anyway.
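
As a rough sketch of that filtering, assuming your pages are already in a list with noindex/canonical fields (the field names and URLs here are my own illustration, not something specific to your site):

```python
# Minimal sketch: emit sitemap XML, skipping pages that are noindexed or whose
# canonical points at a different URL. Page records and field names are hypothetical.
from xml.sax.saxutils import escape

pages = [
    {"loc": "https://example.org/widgets/", "noindex": False,
     "canonical": "https://example.org/widgets/"},
    {"loc": "https://example.org/widgets/?sort=price", "noindex": False,
     "canonical": "https://example.org/widgets/"},   # canonicalised elsewhere
    {"loc": "https://example.org/internal-search/", "noindex": True,
     "canonical": "https://example.org/internal-search/"},
]

def sitemap_xml(pages):
    """Build a sitemap containing only pages you actually want indexed."""
    entries = []
    for page in pages:
        if page["noindex"] or page["canonical"] != page["loc"]:
            continue  # Google would drop these anyway, so don't advertise them
        entries.append(f"  <url><loc>{escape(page['loc'])}</loc></url>")
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries) + "\n"
            + "</urlset>")

print(sitemap_xml(pages))
```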

Also, don't advertise URLs anywhere on your site or in a sitemap that return a page with an HTTP 4xx status code. It's best to always link to pages returning an HTTP 200 (success) status code. The second-best option is an HTTP 301 (permanent redirect) that leads to a page returning HTTP 200.
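
If you want to check that before publishing links, here's a small sketch using the third-party requests library to audit the status codes of internal URLs; the URL list is hypothetical:

```python
# Minimal sketch: audit internal URLs so you only link to pages returning 200,
# or at worst a 301 that leads to a 200. The URLs below are hypothetical.
import requests

urls = [
    "https://example.org/widgets/",
    "https://example.org/old-widgets/",   # might 301 to the new location
    "https://example.org/retired-page/",  # might return 404
]

for url in urls:
    # Don't follow redirects, so we see the same first response the crawler sees.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code == 200:
        verdict = "OK - link to this directly"
    elif resp.status_code in (301, 308):
        verdict = f"permanent redirect to {resp.headers.get('Location')} - update the link if possible"
    else:
        verdict = "fix or stop linking to this URL"
    print(resp.status_code, url, verdict)
```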

Other than that, having a gigantic website with tons of categories linking to other categories on your site is not an issue. Search engine crawlers are smart enough to know that once they have crawled a URL, they have crawled it, and they will likely not re-crawl it until some time has passed and the next round of crawling begins.

Also, make sure your domain is listed in Google Search Console and check the crawl errors report regularly to ensure everything is linking correctly on your site. If there is a mistake, Google will display it for you.


