Method for removal of a high number of spammy links
I haven't been able to figure out the right combination for deindexing spammy links from search results. On one hand, we have the Google URL removal tool, which removes URLs only temporarily (they may show up again after about 90 days). On the other hand, there's no guarantee that Google will act on a disavow request quickly. Given that, how do I get a large number of URLs deindexed using robots.txt or the meta robots tag? Also, other than cleaning up the malicious code, should I be serving a 404 Not Found on every affected page?
404 isn't ideal.
Serving a 410 (Gone) status code is better, as it signals permanent removal to crawlers. Adding a noindex meta robots tag (or X-Robots-Tag header) on those bad URLs is also a good way to speed things up. One caveat: don't block those same URLs in robots.txt at the same time, because a blocked page can't be crawled, so Google never sees the noindex directive.
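As a minimal sketch of the noindex approach described above (the URL paths are hypothetical examples), you can either put the directive in the page's HTML head or send it as an HTTP header, which also works for non-HTML resources:

```html
<!-- In the <head> of each spammy page you want dropped from the index -->
<meta name="robots" content="noindex, nofollow">
```

```apache
# Equivalent HTTP-header form (Apache, mod_headers enabled),
# useful for PDFs or other non-HTML files:
<Files "spammy-doc.pdf">
    Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

Either form tells crawlers to drop the page, but only if the page remains crawlable, which is why it should not be combined with a robots.txt Disallow for the same URLs.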
Serving correct status codes, setting the right meta data, redirecting, and disavowing incoming links are all viable processes, each for different reasons.
Disavow helps Google understand that you don't want those inbound links counted as part of your backlink profile.
Yet while those links exist (even under disavow), crawlers will keep trying to index the pages they point to, which is why you also need to serve the correct meta data and status codes, and make appropriate redirects where it seems logical.
It's tough to give highly accurate advice without seeing the problem first-hand, but the above should cover the basics for most cases.
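For the status-code side of the advice above, here is one way to serve a 410 Gone for known spam URLs, assuming an Apache server with mod_alias and mod_rewrite available (the paths shown are hypothetical):

```apache
# mod_alias: mark a single known spam URL as permanently gone (HTTP 410)
Redirect gone /spammy-page.html

# mod_rewrite: return 410 for everything under a spam directory
RewriteEngine On
RewriteRule ^spam-dir/ - [G]
```

The `[G]` flag is shorthand for a 410 response; it tells crawlers the content was removed deliberately and permanently, which tends to get URLs dropped from the index faster than a 404.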
Do a 301 redirect to a relevant page; it should not hurt your SEO ranking. I had too many links on my website, and it was difficult for me to maintain them and keep the content updated. I did 301 redirects to the most similar pages, and it was helpful.
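The redirect approach in the comment above can be sketched like this, again assuming Apache with mod_alias (both paths are hypothetical examples):

```apache
# Permanently redirect an outdated URL to its closest current equivalent
Redirect 301 /old-spammy-page.html /similar-current-page.html
```

A 301 passes most link signals to the target, so it makes sense only when a genuinely similar page exists; redirecting spam URLs to an unrelated "dummy" page risks being treated as a soft 404, in which case a 410 is the cleaner choice.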