Can backlinks to robots.txt blocked pages boost a domain's ranking?
So, imagine, we were blocking some URLs via robots.txt, but apparently these pages were getting backlinks.
Later, we decided to redirect these blocked pages to another domain of ours.
~1.5 months after we started the redirection, our Google traffic dropped significantly.
Can backlinks to robots.txt-blocked pages boost a domain's ranking, and could redirecting those pages away therefore hurt our domain's ranking possibilities and its search engine traffic?
3 Comments
To answer this, you have to know something of the mechanics behind how all of this works.
Every time a link is discovered, it is entered into a link table within the index. The source URL (the page containing the link) is already in the index's URL table. If the target URL (the page being linked to) is not in the index, the link is a dangling link: Google has not yet investigated both ends of it, so no metrics can exist for it. When a link is discovered and entered into the link table, the target URL is added to the fetch queue.
When the page is fetched, assuming nothing goes wrong, it is indexed and the link value metrics can be calculated and applied. This is done periodically in a batch process. It runs quickly; the last I heard it ran monthly, though it can likely run more often. The process is recursive. Let me give you an example: if a page has a particular PR, then as links to that page are added and dropped, the PR for that page changes, which in turn can affect the pages it links to. This is not a linear calculation; it must be repeated over and over until the effective change is zero, or small enough to have no real effect. Also, please understand that the simple PR model we see, where a PR6 page with 2 outbound (external) links passes PR3 through each link, is completely invalid. Authority caps and link value scores are applied to give a natural curve in the PR model. Otherwise, no site could ever compete against a super-authority site and the whole PR model would be forever skewed.
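The recursive recalculation described above can be sketched as a toy PageRank power iteration. This is purely illustrative: the tiny graph, the damping factor, and the uniform dangling-node handling are textbook assumptions, not Google's actual capped-and-curved scoring.

```python
# Toy PageRank power iteration on a hypothetical three-page link graph.
# Illustrates the "repeat until the change is negligible" idea only;
# it is NOT Google's real (capped, curved) model.

def pagerank(links, damping=0.85, tol=1e-8, max_iter=100):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        new = {p: (1 - damping) / n for p in pages}
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling node: spread its rank evenly (one common convention).
                for p in pages:
                    new[p] += damping * rank[src] / n
        # Stop once no page's score moves by more than the tolerance.
        done = max(abs(new[p] - rank[p]) for p in pages) < tol
        rank = new
        if done:
            break
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```

Note how changing one edge in `graph` shifts every page's score, which is exactly why the calculation has to iterate rather than run once.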
Be that as it may, only complete links, where both the source and target URLs exist in the index, can have calculations applied. Dangling links and broken links (links to non-existent pages) are missing the required target page. It is impossible, and impractical, to pass link value to nothing.
So in your case, since the pages are blocked by robots.txt, they are never fetched and indexed; there is no target page in the index and therefore no value is passed.
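You can see which URLs a compliant crawler is barred from fetching with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical, just to show the mechanism:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, parsed from a list of lines.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /blocked/",
])

# A compliant crawler never fetches a disallowed page, so it is never
# indexed and any links pointing at it remain "dangling".
print(rp.can_fetch("Googlebot", "https://example.com/blocked/page.html"))
print(rp.can_fetch("Googlebot", "https://example.com/open/page.html"))
```

The first check prints `False`, the second `True`: the blocked page can accumulate backlinks in the link table, but its end of the link is never resolved.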
To understand your scenario further, we would need to know exactly what is being redirected and how. Any time a series of redirects exists, each page in the chain must be fetched before the search engine knows where it ends. There is a period of time while the search engines fetch these pages, measure their value, update the metrics, check targets against the robots.txt files, and so on. All of this takes time and has the potential to be disruptive. It is possible that you are simply working through that process now.
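A simple way to see why chains cost the crawler extra work is to trace them hop by hop. This sketch uses a hypothetical in-memory redirect map rather than live HTTP requests; every extra hop in the printed chain is another fetch a crawler must make before it can settle the link metrics:

```python
# Follow a chain of redirects through a hypothetical in-memory map
# and return every hop, oldest URL first.

def trace_redirects(url, redirect_map, max_hops=10):
    chain = [url]
    while url in redirect_map:
        url = redirect_map[url]
        chain.append(url)
        if len(chain) > max_hops:
            raise RuntimeError("Too many redirects: " + " -> ".join(chain))
    return chain

# Hypothetical example: an old URL 301s to a trailing-slash variant,
# which in turn 301s to a page on the new domain.
redirects = {
    "https://old.example/page": "https://old.example/page/",
    "https://old.example/page/": "https://new.example/page",
}
print(trace_redirects("https://old.example/page", redirects))
```

Collapsing such a chain into a single direct redirect removes intermediate fetches and is one concrete way to "make life simple for the search engines".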
My advice to people is to make life as simple for the search engines as possible and not to confuse things too much. Search engines handle 404 errors and 301 redirects well; however, a 301 redirect can sometimes have odd results that webmasters do not expect. If you can think this through a bit and simplify things, this may be a good opportunity to do so.
It makes sense to split your question into two parts and address each separately.
Can backlinks to robots.txt blocked pages boost domain's ranking?
No, these backlinks should not be affecting your domain's authority, and thus its rankings, if they point to pages that are blocked by robots.txt.
However, I can imagine search engines not completely discounting them, because after all those links are a vote of confidence in your website.
Thus hurting our domain's ranking possibilities and SE traffic to our domain after redirecting such pages?
So in theory it shouldn't matter (that much) if you redirect these URLs to another domain. Unless my theory above, that search engines do not completely discount these links, is true; then it can.
To get a better understanding of the situation: what do you mean by 'significantly'? And was there any relation between the URLs that dropped in rankings (and still exist) and the URLs that were redirected?
Was anything else changed during the same period? In situations like this it is very important to rule out as many other causes as we can.
If you block the wrong URLs from being accessed on your site, you can lose traffic or render backlinks ineffective, yes.
Example: if you block your contact page (e.g. /contacts.html), Google and other search engines will learn this and stop listing your contact page.
You should only block URLs in your robots.txt whose content you don't want on the web.
Check that you have done this correctly and have not accidentally included other important URLs.
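One quick way to run that check is to test a list of must-stay-crawlable URLs against your rules with Python's standard-library robots.txt parser. The rules and URL list here are hypothetical placeholders:

```python
from urllib import robotparser

# Hypothetical robots.txt rules for the site being audited.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# URLs that must remain crawlable; adjust to your own site.
important = [
    "https://example.com/",
    "https://example.com/contacts.html",
    "https://example.com/products/",
]

# Any URL that lands in this list is accidentally blocked.
blocked = [u for u in important if not rp.can_fetch("*", u)]
print("Accidentally blocked:", blocked)
```

An empty list means the rules leave all your important pages reachable; anything else points at a rule to fix.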
Redirecting to another domain renders all backlinks to that content invalid for the ranking of your original site: bots no longer see the relation between those backlinks and your page, so the existing backlinks are credited to the redirect target (your new domain) instead. This can also cut the connection between your old URLs and the high-value pages that were linking to you. Without those backlinks, your site may have fallen below some threshold, so your content in general is now ranked lower for lack of high-value backlinks.
Your options now are either to try reverting the changes, or to work on building up new backlinks.