Is there a 0-255 color standard?
So, I was tasked with a job where I'll write a C# application that goes through an image and lists out the color of each pixel. The problem is, I'm not really sure what standard the color needs to follow. The spec says:
'px (a value in the result set)' shall be the pixel color at a
specific spot of the image. The pixel color shall be between 0-255,
inclusive.
Later in the spec, it gives an "example of a 3x5 image," then proceeds to display 3 rows of 5 single numbers between 0 and 255.
Obviously it isn't asking for the Hex or CMYK values. It seems like it could be asking for the RGB, but it really seems like it's asking for a SINGLE integer value between 0-255 that fully identifies the color of a particular pixel.
Does such a standard exist?
This is an excerpt from the spec:
Here is an example of a 3x5 image file, “babydrawing.img”.
255, 6, 65, 78, 99
100, 25, 0, 45, 66
88, 190, 88, 76, 50
2 Comments
I can understand the confusion: for RGB one normally expects a three (or four) byte number 0xRRGGBB, where RR, GG and BB each range from 0x00 to 0xFF. You may think that RGB8 is a similar standard with just 2 bits each for red, green and blue (and a 2 bit alpha channel). But no¹...
From the sound of it, the image is a bitmap² using an 8 bit colour map (8bpp), so 256 distinct colours are supported.
Each byte represents one pixel³ and is an index into a table of 256 colours. This table is held in memory and may contain either a user-defined or a standard system-defined set of colours; it is known as a colour palette.
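For what it's worth, here is a minimal C# sketch of reading the per-pixel indices and the palette back out of an 8bpp indexed bitmap with System.Drawing. The file name is made up, and it assumes the image really is Format8bppIndexed (e.g. a 256-colour BMP); on non-Windows platforms you would also need the System.Drawing.Common package or some other imaging library.

    using System;
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.Runtime.InteropServices;

    class IndexedDump
    {
        static void Main()
        {
            // Hypothetical file name; assumed to be an 8bpp indexed bitmap.
            using (var bmp = new Bitmap("indexed8bpp.bmp"))
            {
                if (bmp.PixelFormat != PixelFormat.Format8bppIndexed)
                    throw new InvalidOperationException("Expected an 8bpp indexed image.");

                // The colour palette: up to 256 entries, one per possible index value.
                Color[] palette = bmp.Palette.Entries;

                var rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
                BitmapData data = bmp.LockBits(rect, ImageLockMode.ReadOnly,
                                               PixelFormat.Format8bppIndexed);
                try
                {
                    byte[] row = new byte[bmp.Width];
                    for (int y = 0; y < bmp.Height; y++)
                    {
                        // Copy one scanline; Stride may include padding,
                        // so advance by Stride rather than Width.
                        Marshal.Copy(IntPtr.Add(data.Scan0, y * data.Stride),
                                     row, 0, bmp.Width);
                        for (int x = 0; x < bmp.Width; x++)
                        {
                            byte index = row[x];           // the 0-255 value stored per pixel
                            Color colour = palette[index]; // what that index currently maps to
                            Console.WriteLine($"({x},{y}): index {index} -> {colour}");
                        }
                    }
                }
                finally
                {
                    bmp.UnlockBits(data);
                }
            }
        }
    }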
The actual colours in this colour palette are not set in stone. There is no real standard, apart from whatever a particular OS defines. It is possible to change the colours in the palette so that the colours to which the indices refer change; this is so-called colour palette animation, and it can be used to create some rather psychedelic effects.
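As a sketch of that idea (again assuming a System.Drawing Bitmap in an 8bpp indexed format): note that Bitmap.Palette hands back a copy, so the modified palette has to be assigned back for the change to take effect.

    using System.Drawing;
    using System.Drawing.Imaging;

    static class PaletteTricks
    {
        // Re-points one palette slot at a new colour. Every pixel whose stored
        // index equals `slot` changes appearance, but the pixel data itself is
        // untouched; that is the palette animation trick.
        public static void Repaint(Bitmap bmp, int slot, Color replacement)
        {
            ColorPalette pal = bmp.Palette;   // the getter returns a copy of the palette
            pal.Entries[slot] = replacement;  // change what this index refers to
            bmp.Palette = pal;                // write the modified palette back
        }
    }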
The values can indeed represent an increasing greyscale, but in this case the OP states that each number represents a colour and not a shade of grey, in which case it is using a colour palette.
So, in short, the answer to the OP's question is that there is no set colour for each pixel; the C# program merely has to obtain the colour index for each pixel. The actual colour that an index represents is irrelevant until the colour palette has been defined.
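Going by the excerpt in the question, the .img file looks like plain text: one image row per line, comma-separated values in the 0-255 range. If that really is the layout (an assumption on my part), listing the pixel values is just a matter of reading the numbers back out:

    using System;
    using System.IO;
    using System.Linq;

    class PixelLister
    {
        static void Main()
        {
            // Assumes the .img file is plain text, one image row per line,
            // with comma-separated values 0-255 (as in the spec's example).
            string[] lines = File.ReadAllLines("babydrawing.img");

            for (int y = 0; y < lines.Length; y++)
            {
                byte[] row = lines[y]
                    .Split(',')
                    .Select(s => byte.Parse(s.Trim()))
                    .ToArray();

                for (int x = 0; x < row.Length; x++)
                {
                    // Just report the stored value; what colour it maps to
                    // depends on whichever palette (or grey ramp) is in force.
                    Console.WriteLine($"px({x},{y}) = {row[x]}");
                }
            }
        }
    }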
I hope that makes sense.
As an historical aside, not so long ago (the mid 80s), 8 bit video cards were at the top end of the technology of the day. Then along came 16 bit colour (thousands of colours) in the early 90s, then 24 bit colour (millions of colours) in the mid 90s (Apple Quadra series), and now 32 bit is de rigueur.
¹ Although there are exceptions (3 bits for R and G and 2 bits for B), credit goes to joojaa for the tip.
² There are a number of bitmap formats: BMP (Windows), XPM (X Windows), TIFF and PPM, to name just a few. See Bitmap image file extension list.
³ It might be worth also having a look at the Wikipedia entry for Pixel Format.
1-byte integers are pretty normal for images; you could say it's standard. Hexadecimal is just the same thing in a different notation (ff = 255, 00 = 0).
One component per pixel is also not terribly rare; it could be a grayscale image or any of many other things, such as an alpha channel.
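If the numbers do turn out to be grey levels rather than palette indices, the mapping to an actual colour is just the same value on all three channels; a tiny sketch using System.Drawing's Color type:

    using System.Drawing;

    static class Grey
    {
        // Interprets a single 0-255 value as a grey level:
        // equal red, green and blue components, no palette involved.
        public static Color FromLevel(byte level) =>
            Color.FromArgb(level, level, level);
    }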