
Will links into pages that are restricted by Disallow rules in robots.txt help SEO?

@Murray432

Posted in: #Backlinks #RobotsTxt #Seo #WebCrawlers

I have a search-engine-based website, and the search functionality itself is not crawlable because of a Disallow: rule in robots.txt. Other content on the site is crawlable.
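
For concreteness, a setup like that can be sanity-checked with Python's standard-library robots.txt parser. The /search/ path and example.com domain below are placeholders for illustration; substitute whatever your search feature actually uses.

    # Check that a Disallow rule blocks the search URLs while leaving
    # the rest of the site crawlable.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /search/",  # hypothetical search path
    ])

    print(rp.can_fetch("Googlebot", "https://example.com/search/?q=foo"))  # False
    print(rp.can_fetch("Googlebot", "https://example.com/about"))          # True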

If I get incoming links to the search portions of the site, will they be as good for SEO as links to the front page of the site would be?


1 Comment


@BetL925

Robots.txt does not prevent links from passing PageRank

We know this because pages that robots.txt disallows sometimes end up ranking even though Googlebot can't crawl them. In those cases, the "title" Google shows for the page is usually based on the anchor text of the links pointing at it. Such pages rank solely on the strength of their external links.

I've also run experiments showing that when a site links internally to pages blocked by robots.txt, less PageRank is available to flow to the site's other pages.
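
One way to build intuition for that effect is to model the site as a tiny link graph and compare how much PageRank the crawlable pages hold with and without an internal link to a blocked page. The Python sketch below is purely illustrative: the three-page site is made up, and the blocked page is modeled as a dangling node (it receives rank, but the crawler can't see its outbound links). It is not a model of Google's actual system.

    # Toy PageRank comparison: how much rank do the crawlable pages
    # hold when an internal link points at a robots.txt-blocked page?
    DAMPING = 0.85  # the standard damping factor

    def pagerank(graph, iterations=50):
        """graph maps each page to the list of pages it links to."""
        pages = list(graph)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new = {p: (1.0 - DAMPING) / n for p in pages}
            for page, links in graph.items():
                if links:
                    share = DAMPING * rank[page] / len(links)
                    for target in links:
                        new[target] += share
                else:
                    # Dangling node: spread its rank evenly over all
                    # pages, one common way to re-assign "lost" rank.
                    for p in pages:
                        new[p] += DAMPING * rank[page] / n
            rank = new
        return rank

    # Three crawlable pages linking in a ring.
    site = {"home": ["a"], "a": ["b"], "b": ["home"]}

    # The same site, where "home" also links to a blocked search page.
    site_blocked = {"home": ["a", "search"], "a": ["b"],
                    "b": ["home"], "search": []}

    for label, graph in [("no blocked page", site),
                         ("with blocked page", site_blocked)]:
        ranks = pagerank(graph)
        held = sum(r for p, r in ranks.items() if p != "search")
        print(f"{label}: crawlable pages hold {held:.3f} of the rank")

In the first case the crawlable pages hold all of the rank; in the second, the blocked search page keeps a slice of it. The dangling-node branch, which spreads that page's rank evenly over all pages, is one concrete version of the re-assignment idea discussed below.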

Google can't tell what blocked pages link to

Because a blocked page can't be crawled, Google can't see what it links to. So even if blocked pages collect some PageRank, they have no visible way to pass it along and can't directly help the other pages on your site.

Google may re-assign PageRank lost to uncrawlable pages

Sites that block lots of pages with robots.txt (and frequently link to those pages internally) seem to enjoy good rankings anyway. The same goes for sites that try to use nofollow to sculpt PageRank: the practice doesn't seem to hurt as much as I would expect. This is likely because Google re-assigns the "lost" PageRank to other pages on the site, possibly by distributing it evenly across all indexed pages. Another explanation is that Google has a concept of "domain authority" that it uses in addition to PageRank, and that helps in these cases.

Inbound links to blocked pages may help some

So my takeaway would be:


All (non-spammy) inbound links, whether to blocked pages or not, should help somewhat.
Links to blocked pages are not going to help directly: they won't help specific pages rank, and their anchor text won't matter much.
Links to blocked pages probably don't help as much as links to crawlable pages (though it's hard to say how much less).


So I would:


Gladly accept inbound links to blocked pages.
Not go out of my way to build links to blocked pages.
Not do anything foolish to squeeze every last ounce of link juice out of blocked links. For example, making your site's search results crawlable would be a direct violation of the Google webmaster guidelines.
