
Is serving a blank page rather than a 404 error bad for SEO?

@Tiffany637

Posted in: #Seo #Soft404

My website is set up in such a way that any rubbish URL (e.g. example.com/asdfgh) will get a blank page and not a 404 error. I am not sure if it's bad or good for SEO. Could you let me know what you think about this?

This is the only reaction on this topic that I found on the net:


This is not an error; there exist unlimited URLs for any canonical URL, because when you add something to a valid URL it is just handled as a parameter. When the parameter is not used within the view, it simply has no effect, so you get "unlimited" valid URLs. That is not an error, it is by design, and search engines handle it completely correctly. 404 is actually a code for an invalid link, but our job is to make sure there are no invalid links. Remember the old days, when you wrote a page in static HTML and linked to a non-existing file; that is what the 404 is for. But in a CMS you always open /index.php, which is always valid. So even our simulation of a 404 is nonsense; it should be a 303 or so, actually.
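To illustrate that front-controller point: even when every request goes through a single entry point, the application can still tell the client that nothing matched. Here is a minimal sketch, assuming a Node/Express setup (the question doesn't say what the site actually runs on; a PHP index.php could do the same with http_response_code(404)):

// Minimal Express front controller (TypeScript). Known routes are handled
// normally; anything else falls through to a real 404 instead of a blank 200.
import express from "express";

const app = express();

// A known, valid route.
app.get("/", (_req, res) => {
  res.send("<h1>Home</h1>");
});

// Catch-all: instead of a blank page with a 200 status, send a 404
// status code together with a short explanation.
app.use((_req, res) => {
  res.status(404).send("<h1>Page Not Found</h1>");
});

app.listen(3000);

You can check what your server currently sends for a junk URL with curl -I example.com/asdfgh; a blank page served with a 200 status is exactly what Google reports as a soft 404.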





3 Comments


 

@Margaret670

Google does not care what you return in the body of a 404 HTTP response, because it does not crawl anything inside a 404 page. If you have a custom 404 error page and it contains links, Googlebot is not going to crawl those links. When Googlebot requests a resource and gets a 404 error, it doesn't look at the content at all; it simply reports back to Search Console that the URL was not found, so the webmaster knows about it.

It may sound strange, but Googlebot respects your server's HTTP response codes heavily. For example, when Googlebot tries to crawl your robots.txt and it returns a 5XX error, Google won't crawl your website at all. Similarly, when it sees your server return a 404 error, it simply doesn't look into that page; your browser can still display the custom 404 error page, but Googlebot doesn't see it. There is an official statement from John Mueller on this.

Make a 404 page that is useful for your visitors, and analyze your 404s from time to time using Google Search Console. If some URLs have good backlinks but return a 404 error page, it is better to implement a 301 redirect to a relevant resource so you preserve the link equity.
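As a rough sketch of that redirect-then-404 approach, again assuming an Express server; the paths in the redirect map are made up, and in practice they would come from your own Search Console and backlink data:

// Hypothetical redirect map: old URLs that still attract backlinks are
// 301-redirected to the closest relevant replacement; everything else
// that matches no real route gets a genuine 404.
import express from "express";

const app = express();

const redirects: Record<string, string> = {
  "/old-article": "/blog/new-article",
  "/discontinued-product": "/products/replacement",
};

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    res.redirect(301, target); // permanent redirect preserves the link equity
  } else {
    next();
  }
});

app.use((_req, res) => {
  res.status(404).send("<h1>Page Not Found</h1>");
});

app.listen(3000);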



 

@Caterina187

Google does indeed call these "soft 404s". Soft 404s are a pet peeve of mine.

They are very common; many of the sites whose Google Search Console data I have access to have them.

But they shouldn't.

If a server can't return the content the client is requesting, it should inform the client by returning an error status code in the HTTP response, not a '200'-series "success" code.

According to Google, 404s won't harm your rankings.

Regardless, here's the code from my "default" 404 page:

<h1>Page Not Found</h1>
<p>Something isn't right here, the URL you requested isn't available.</p>
<p>Go to the <a href="/">home page</a>.</p>
<p>Try the <a href="/sitemap/">sitemap</a>.</p>
<p>Maybe the <a href="https://web.archive.org/">Internet Archive's Wayback Machine</a> has a copy of that page…</p>
<div id="wb404"></div>
<script src="https://archive.org/web/wb404.js"></script>
<p>Or maybe Google can help…</p>
<script>
  var GOOG_FIXURL_LANG = (navigator.language || '').slice(0, 2),
      GOOG_FIXURL_SITE = location.host;
</script>
<script src="https://linkhelp.clients.google.com/tbproxy/lh/wm/fixurl.js"></script>


It's a bit of overkill but I offer users:

1.) an explanation that the page they requested is unavailable

2.) a link to the home page

3.) a link to the HTML sitemap

4.) a script from archive.org that attempts to show old versions of the page from The Wayback Machine

5.) a Google script that attempts to guess what they were requesting
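One thing worth spelling out: this markup only helps if the server also sends a real 404 status code; otherwise the helpful page is itself a soft 404. Here is a minimal sketch of wiring that up, assuming an Express server and that the markup above is saved as 404.html (the filename is illustrative); on Apache, the equivalent is an ErrorDocument 404 directive.

// Final handler: serve the custom 404 page above, but with a real
// 404 status code so it isn't reported as a soft 404.
import express from "express";
import path from "path";

const app = express();

// ...normal routes and static file handling go here...

app.use((_req, res) => {
  res.status(404).sendFile(path.join(__dirname, "404.html"));
});

app.listen(3000);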



 

@Si4351233

Dead links are dead links, and search engines won't give you brownie points for a page with no content. From a user's perspective, it's always good to have something shown for every invalid URI they try; from your perspective, it's better not to leave visitors hanging as to what happened.

So handle invalid URIs gracefully. Be liberal with what you accept.


