HTTP status 400 code pages are getting picked up by Google, appropriate solutions?
Recently we received a notice from Google Webmaster Tools stating "Googlebot found an extremely high number of URLs on your site:..."
We noticed that several of the sample URLs listed pointed to pages returning a 400 HTTP status for malformed searches or requests on our site. What would be an appropriate way to prevent Google from picking up these pages? Should we return a 404 instead? We are also considering using a meta content="noindex" tag.
Thanks in advance for your help!
First, you need to assess the URLs that Google complains about.
If the URLs contain a bunch of gibberish characters and don't seem to fit in with your site, then I'd leave them as 400 status and remove the affected URLs from any sitemaps you have submitted to Google. Examples of qualifying URLs:
example.com/%20%34%FA%C4aaa%FF/%F3
http://example.com/<>/</>/?/skfhsfh
If the URLs contain normal characters and blend in with your site, but the pages at those URLs no longer exist, then a 410 (Gone) status should be returned so you don't waste Google's time trying to index useless pages.
If the URLs are meant to point to pages that will exist at some point in the future but don't yet, then a 404 should be returned.
Examples of URLs with normal characters are:
example.com/mypage/page-1
http://example.com/fruits/bananas
example.com/fruits/apples.html
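The status-picking rules above can be sketched as a small Python helper. This is only an illustration: GONE_PATHS, FUTURE_PATHS, and the malformed-URL heuristic are assumptions for the sketch, not a complete implementation (a real server would decide "gone" vs "future" from its own routing or database).

```python
from urllib.parse import unquote

# Illustrative path sets (assumptions for this sketch).
GONE_PATHS = {"/fruits/bananas"}        # content permanently removed -> 410
FUTURE_PATHS = {"/fruits/apples.html"}  # planned but not live yet    -> 404

def pick_status(path: str) -> int:
    """Map a requested path to the HTTP status suggested above."""
    decoded = unquote(path)  # invalid percent-escapes decode to U+FFFD
    # Gibberish / malformed request: keep returning 400 (Bad Request).
    if "\ufffd" in decoded or any(ch in decoded for ch in "<>?"):
        return 400
    if path in GONE_PATHS:
        return 410   # Gone: tells crawlers to stop retrying
    if path in FUTURE_PATHS:
        return 404   # Not Found: may exist later
    return 200       # Normal page

print(pick_status("/%20%34%FA%C4aaa%FF/%F3"))  # gibberish URL -> 400
print(pick_status("/fruits/bananas"))          # removed page  -> 410
```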
On thin-content pages (pages with little content) that return status 200 but that you don't want search engines to index at all, I recommend adding a noindex directive.
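If you go the noindex route, the page-level form is a robots meta tag in the document head, for example:

```html
<!-- In the <head> of each thin page you want kept out of the index -->
<meta name="robots" content="noindex">
```

For non-HTML resources (PDFs, images), the equivalent is the `X-Robots-Tag: noindex` HTTP response header set by your server.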