Is it a good practice to make low quality links un-crawlable to increase link juice?

@Jamie184

Posted in: #Backlinks #Links #Seo

Is it good practice to make low quality links un-crawlable for search engine spiders? Usually we put nofollow on them, but that still divides the link juice. We would like all of it to flow to the other do-follow links instead.

If yes, is there a particular JavaScript technique we should use?
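
For reference, this is the kind of markup we use today (the URL is just a placeholder):

    <!-- nofollow hint: asks crawlers not to pass PageRank through this link -->
    <a href="https://example.com/thin-page" rel="nofollow">Thin page</a>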


1 Comment


@BetL925

There is no need to worry about lost link juice for internal links. You "lose" link juice when you use nofollow or when you link to a page blocked by robots.txt, yet many sites do one or the other with a large number of links on every page and still rank just fine.
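
To be concrete, these are the two mechanisms I mean (the path is made up):

    <!-- nofollow: the link is still crawlable, but asks Google not to pass PageRank -->
    <a href="/low-quality-page" rel="nofollow">Low quality page</a>

    # robots.txt: the target is never crawled, so juice flowing into it
    # is effectively lost to the rest of the site
    User-agent: *
    Disallow: /low-quality-page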

I've experimented with this before. I tried hiding links from Googlebot so that it would conserve every ounce of PageRank. It didn't help.

My theory is that Google uses internal links to assign a relative weight to each of your pages and then scales those weights by your overall domain rank. "Losing" internal PageRank ends up not mattering because of that scaling factor.
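
As a toy illustration of that theory (the formula and the numbers are my own invention, not anything Google has documented):

    // score(page) = (page's share of internal link weight) * domainRank
    var domainRank = 10;                          // fixed site-wide factor
    var weights = { home: 6, products: 3, contact: 1 };
    var total = 6 + 3 + 1;
    for (var page in weights) {
      console.log(page, (weights[page] / total * domainRank).toFixed(1));
    }
    // home 6.0, products 3.0, contact 1.0 -- any PageRank "lost" to
    // nofollow never enters the calculation; only the relative weights
    // and the domain factor matter.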

Writing links with JavaScript won't help these days either. Googlebot does a great job of executing JavaScript and can render pages just like a real browser does, so it sees links created by JavaScript just as well as plain HTML links.
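
For example, Googlebot will still discover a link injected like this (the URL is a placeholder):

    <div id="nav"></div>
    <script>
      // Googlebot renders this script, so the generated link is treated
      // much like a static HTML link
      var a = document.createElement('a');
      a.href = '/low-quality-page';
      a.textContent = 'Low quality page';
      document.getElementById('nav').appendChild(a);
    </script>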
