My website was hacked (now recovered), but the hacker got 5000 URLs indexed in Google and those URLs now return a 404 error
A 404 is probably preferable to blocking with robots.txt if you want these URLs dropped from the search engines (i.e. Google). If you block crawling then the URLs could still remain indexed, since robots.txt blocks crawling, not indexing.
If you want to "speed up" the de-indexing of these URLs then you could perhaps serve a "410 Gone" instead of the usual "404 Not Found". You could do something like the following with mod_rewrite (Apache) in your root .htaccess file:
RewriteEngine On
# Serve "410 Gone" for the hacked URLs; "spam-path" is a placeholder pattern to replace
RewriteRule ^spam-path - [G]
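If the hacked URLs don't share a common pattern, a rough alternative (just a sketch, assuming your legitimate pages map to real files or directories on disk) is to serve the 410 only for requests that don't resolve to anything that actually exists:

RewriteEngine On
# Sketch: send "410 Gone" for any request that does not map to an existing file or directory.
# Assumes legitimate URLs correspond to real files/directories; a CMS with its own rewrite
# rules would need this combined with (or placed after) its front-controller rule.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ - [G]

Either way, Google will drop the URLs as it recrawls them; a 410 just tends to get them removed a little quicker than a 404.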