How can the Google Analytics "Screen Colors" metric be remotely accurate?
This metric has been around a long time in GA, but only today have I realized I don't think it's possible for it to be useful. Is there some way for GA to tell what monitor a user has? As far as I know, it can only tell what the system color depth is set to.
The reason this is an issue is that nearly all of the cheap LCD panels sold are only 6 bits per subpixel. compreviews.about.com/od/multimedia/a/LCDColor.htm www.tftcentral.co.uk/articles/6bit_8bit.htm
It seems like the majority of users should show up as 6-bit (18-bit combined, 24-bit if you count alpha) color. Instead, looking at some high-traffic sites, 32-bit is the vast majority.
Is there some way this GA metric can be accurate?
The GA screen color metric is accurate. Whether a monitor uses spatial dithering or temporal dithering to achieve its perceived color depth is irrelevant. At the end of the day, it's the resultant color depth perceived by the end user that matters, not the technology behind it. So Google Analytics is measuring the only color depth metric that's meaningful to webmasters.
It's much more useful for a webmaster to know what percentage of visitors to their site will see the site in 24-bit color versus 18-bit color than it is to know what percentage of their visitors have monitors with 8-bit subpixels versus 6-bit subpixels.
Because an "18-bit" TN display that GA reports as 24-bit is going to produce an end-user experience much closer to that of 24-bit S-PVA or S-IPS displays than to displays that produce actual 18-bit output (which GA would report as 18-bit).
After all, pretty much all color displays only simulate their color range. Just like halftone printing, digital color displays actually only produce light in a small set of primary colors. It's only when our eyes/mind blend the close-packed (usually RGB) subpixels that we get the perceptual colors and color gamut we experience.
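To make that concrete, here's a minimal sketch of spatial dithering (my own illustration, not from either article; the canvas element and gray levels are arbitrary choices): alternate two shades the panel can actually produce, and the eye blends them into an in-between shade it can't.

<script type="text/javascript">
// Checkerboard two displayable gray levels; viewed at normal
// distance, the pattern blends toward their average (~102),
// a shade the hypothetical 6-bit panel cannot natively produce.
var canvas = document.createElement("canvas");
canvas.width = 128;
canvas.height = 128;
document.body.appendChild(canvas);
var ctx = canvas.getContext("2d");
for (var y = 0; y < canvas.height; y++) {
  for (var x = 0; x < canvas.width; x++) {
    var v = ((x + y) % 2 === 0) ? 100 : 104; // two adjacent 6-bit steps
    ctx.fillStyle = "rgb(" + v + "," + v + "," + v + ")";
    ctx.fillRect(x, y, 1, 1);
  }
}
</script>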
And it's not just printing and displays. The same is true of digital imaging sensors in cameras, which use Bayer filters and post-processing algorithms to simulate the color depth and resolution they advertise. Heck, even your visual perception uses the same cheat, generating a continuous range of colors from the tristimulus values of three sets of cone cells.
Edit:
I just want to point out this paragraph at the end of the TFT Central article you linked to:
In terms of colour accuracy and colour reproduction, the use of 6-Bit + FRC doesn't make a massive difference. Modern TN Film panels for instance can offer very good colour reproduction qualities and DeltaE can be very low on colour calibration graphs. To the average user, colours still look very good and it might well be very hard to tell any real difference between a 6-Bit and an 8-Bit screen.
Sure, if the end user decides to run a color gradients test, then a 6-bit color depth device using spatial dithering will perform very differently from an 8-bit color depth device. However, a 6-bit color depth device using temporal dithering (via A-FRC) will perform very much like any true 8-bit color screen. Additionally, on most websites you're unlikely to see exactly the type of gradient that produces banding on 6-bit panels but not on 8-bit panels.
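If you want to try that gradients test yourself, a quick sketch (again my own, not from the article) is to draw a slow grayscale ramp; a spatially dithered 6-bit panel tends to show visible bands where an 8-bit or 6-bit + A-FRC screen looks smooth.

<script type="text/javascript">
// Draw 64 closely spaced gray levels across a 512px strip;
// banding, if any, shows up as visible vertical steps.
var strip = document.createElement("canvas");
strip.width = 512;
strip.height = 64;
document.body.appendChild(strip);
var sctx = strip.getContext("2d");
for (var x = 0; x < strip.width; x++) {
  var g = 96 + Math.floor(x / strip.width * 64);
  sctx.fillStyle = "rgb(" + g + "," + g + "," + g + ")";
  sctx.fillRect(x, 0, 1, strip.height);
}
</script>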
In contrast, if you compare displays that are outputting at 16-bit or 18-bit color (like the iPAQ 200), you'll immediately notice a decreased color range compared to 24-bit color displays (whether they use 8-bit or 6-bit subpixels). And this is going to affect webmasters much more than the technological nuances used to achieve the same perceptual color depth.
There are plenty of things to be concerned with when choosing LCD displays (gamut, viewing angle, response time, luminance, contrast ratios, etc.), but with modern LCD displays, 6-bit versus 8-bit subpixels, just like ghosting, doesn't really matter. As the TFT Central FAQ puts it:
A. There is a lot of talk about colour depth on TFT screens. It's important to put this into perspective though, and not jump on the bandwagon of 8-bit being much, much better than 6-bit. Yes, 8-bit displays are preferable, and can offer an improved colour palette, more freedom from grading and banding, and are the choice for colour critical displays. However, modern 6-bit screens use a range of FRC technologies which can offer some decent results. On some 6-bit + FRC panels colour range is good, screens show no obvious gradation of colours, and they show no FRC artefacts or glitches.

Manufacturers use 6-bit panels (+FRC) to help keep costs lower, and for the majority of users I would suggest it is difficult to tell the difference in practice between a 6-bit or 8-bit panel. Colour accuracy of modern 6-bit panels (mostly TN Film) is also very impressive, an area which used to be lacking. If you're an average user, you shouldn't worry about the situation too much; most users will find a 6-bit panel perfectly adequate for their needs. If you need a display for colour critical work, then you should certainly consider the graphics range from manufacturers, which all use 8-bit (or above) colour depth.
GA is reporting the browser's DOM value for screen.colorDepth.
Example:
<script type="text/javascript">
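// screen.colorDepth is the bits-per-pixel depth the browser
// reports for the screen (typically 24 or 32 on desktop systems)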
document.write("Color Depth: " + screen.colorDepth);
</script>
My screen reports 24.
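As I understand it (this is my reading of the tracking beacon, not documented behavior), classic ga.js just formats this DOM value and sends it along in the __utm.gif request as the utmsc parameter:

<script type="text/javascript">
// Rough approximation of the value GA transmits; the real
// script wraps this in more logic.
var utmsc = screen.colorDepth ? screen.colorDepth + "-bit" : "unknown";
document.write("utmsc would be: " + utmsc); // e.g. "24-bit"
</script>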
I don't see dates on either of the documents you cite and wonder if the figures are up to date. The HTTP headers for the TFT Central article indicate that the page is almost 4 years old.
In my own GA stats I see 50.5% at 24-bit, 47% at 32-bit, 2% at 16-bit...
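The 24-bit/32-bit split most likely reflects the OS display setting rather than the panel: some browsers (IE on Windows set to 32-bit "True Color", for instance) pass the desktop depth through, while others normalize to 24. You can compare the two related DOM properties yourself:

<script type="text/javascript">
// colorDepth and pixelDepth are usually equal; pixelDepth may
// include bits (e.g. alpha/padding) that carry no extra color.
document.write("colorDepth: " + screen.colorDepth + "<br>");
document.write("pixelDepth: " + screen.pixelDepth);
</script>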