Will it affect SEO if there are 30 links on a page whose destinations are blocked by robots.txt?
I have a webpage with around 30 data points, each linking to a different page on my own site. Those 30 destination pages are non-crawlable, having been blocked via robots.txt.
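For context, the blocking is done with robots.txt rules roughly like the following (the /data/ path is just an illustrative placeholder, not my real URL structure):

# illustrative robots.txt: asks all crawlers not to crawl the linked pages
User-agent: *
Disallow: /data/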
My question is: if we use hrefs over these blocks, pointing at the non-indexable pages, does this have any impact on the SEO of the page?
Should this be handled with JS rather than hrefs to avoid the above, or, if hrefs are used, should they carry nofollow? And if neither has been done and plain hrefs have been used, would that impact SEO negatively?
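To be clear, by a nofollow href I mean markup roughly like this (the URL and link text are hypothetical placeholders):

<!-- illustrative only: a link hinting to search engines not to follow it -->
<a href="/data/point-1" rel="nofollow">Data point 1</a>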
First of all, a robots.txt file does not actually block crawlers; it only asks well-behaved ones not to crawl. If you want to truly block them, you need to do it at the server level, for example with Apache (.htaccess) or PHP. A robots.txt file is like putting up a sign in your yard that says "no trespassing." Blocking at the server is like putting a big fence around your yard.
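As a rough sketch of the server-side option (the /data/ path and the bot names are only examples, and mod_rewrite must be enabled), .htaccess rules like these would return 403 Forbidden to matching crawlers:

# illustrative only: deny requests to /data/ from these user agents
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot) [NC]
RewriteRule ^data/ - [F,L]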
Non-indexed pages will neither help nor hurt your SEO, unless those pages contain things that really should be indexed.