
Best method to avoid link juice dilution when a page has lots of links

@Si4351233

Posted in: #Links #Seo #Usability

I have a page with, say, over 500 links. Most of them (484) are stacked in 2 table cells. They all point to search results, with each link showing a different search result.

What is the best way to improve SEO for this page?

One idea that I think is good: load only a few links initially, then use JS to load the additional links on the fly, triggered by certain events, in order to improve both usability and SEO.


4 Comments


@Mendez628

As you said, keep a certain number of links visible initially, then load the rest with JS.

The first 20 links:

<a href="/link-target/">link</a>


The subsequent links:

<a class="lazy-load" title="/link-target/">link</a>


Followed by:



$(document).ready(function () {
    // Copy the URL stashed in each link's title attribute into href
    $("a.lazy-load").each(function () {
        $(this).attr("href", $(this).attr("title"));
    });
});


This means that at page generation (when Google crawls it) there are only 20 real links, yet the page is still valid HTML.
For users who actually view the page (presuming they have JS enabled and you include jQuery - it could be done without it), the full set of links appears and they never know anything happened.
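
Since jQuery is optional here, a minimal plain-JS sketch of the same idea (assuming the same lazy-load class and title convention as above):

document.addEventListener("DOMContentLoaded", function () {
    // Copy the URL stashed in each link's title attribute into href
    document.querySelectorAll("a.lazy-load").forEach(function (link) {
        link.setAttribute("href", link.getAttribute("title"));
    });
});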


@Speyer207

A solution that I would recommend:

Load all the links in an <iframe> and add a rule in your robots.txt to disallow the URL that the <iframe> loads (robots.txt blocks URLs, not the <iframe> element itself).
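
A minimal sketch of that setup, assuming the link list is moved to a hypothetical page at /all-links.html. On the main page:

<iframe src="/all-links.html"></iframe>

And in robots.txt:

User-agent: *
Disallow: /all-links.html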


@Megan663

Google doesn't want to index pages of search results. You should have a robots.txt rule that prevents Google from crawling all of your search result pages. If you don't, you run the risk of getting a penalty from Google for showing search result pages in their search results.
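
For example, assuming your search result pages all live under a hypothetical /search path, the robots.txt rule would look like:

User-agent: *
Disallow: /search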

Linking to pages that are in robots.txt does dilute link juice. I've tested it. As far as I can tell, it is very similar to using nofollow on links. Google drops the PageRank on the floor.

As far as "avoiding link juice dilution" goes, you don't need to worry. Lots of links with nofollows or lots of links to pages in robots.txt have never hurt any of the sites that I have worked with. As a webmaster, you don't need to do anything to preserve PageRank. You might think that this PageRank gets "lost". That isn't the case. Losing PageRank doesn't hurt your popular crawlable pages. It is almost like Google scoops up that lost PageRank and redestributes it across the pages on your site that it can crawl and index.


@YK1175434

A webpage with 500 links is rarely well optimized for SEO. It's also bad for the user experience, because it's hard to find the desired link in such a long list.

A good method would be to divide the page into several pages, for example by grouping the links into categories. That way, you get several pages with fewer links each (better for both SEO and the user experience). Moreover, don't forget to add some text around those links to give your new pages more SEO weight.
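
As an illustration (all names and paths here are hypothetical), one such category page might look like:

<h1>Laptop deals</h1>
<p>A short paragraph describing the category gives the page its own content and more SEO weight.</p>
<ul>
<li><a href="/search?q=laptop-model-a">Laptop Model A</a></li>
<li><a href="/search?q=laptop-model-b">Laptop Model B</a></li>
</ul>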
