I have color data in a bitstream as plain BBGGGRRR, where
pixel = some i in range(0, 256); // any 8-bit value
R = (pixel & 0b00000111); // low three bits
G = (pixel & 0b00111000); // middle three bits
B = (pixel & 0b11000000); // high two bits
But reading the man page for glTexImage2D I’m somewhat confused as to which type I should specify so that my data is interpreted this way. I am assuming one of GL_UNSIGNED_BYTE, GL_UNSIGNED_BYTE_3_3_2 or GL_UNSIGNED_BYTE_2_3_3_REV, but I can’t test it out on this computer right now and the doubt is killing me!
I didn’t make this image. But it is exactly the spectrum I’m using:
^this is what I’d expect to see when sending in an array with all values from 0 to 255.
That’s format=GL_RGB, type=GL_UNSIGNED_BYTE_2_3_3_REV. For packed types, the numbers give the field widths starting from the high bits, so 2_3_3 means 0bxxyyyzzz. In non-_REV types the components fill the fields from high bits to low, making GL_UNSIGNED_BYTE_3_3_2 RRRGGGBB; _REV reverses the component order, making GL_UNSIGNED_BYTE_2_3_3_REV BBGGGRRR.
Note that GL_BGR isn’t accepted for any packed type (although GL_BGRA is). You can use glTexParameter with the GL_TEXTURE_SWIZZLE_* enumerants to re-order the components of a texture as they’re read.
There is no real advantage to using BBGGGRRR: many newer GPUs will convert it to an 8-bit-per-channel BGR format on upload anyway, so in the end you pay for a CPU-side conversion both when loading data to the GPU and when reading it back.
It should also be noted that Vulkan doesn’t even allow implementations to support this format. Vulkan has plenty of image formats that aren’t required to be supported, so the fact that this one isn’t even an option implementations could expose suggests that essentially no hardware of note handles BBGGGRRR natively.