I have color data in a bitstream as plain BBGGGRRR, where
    pixel = some i in range(0, 255);
    R = (pixel & 0b00000111); // lowest three bits
    G = (pixel & 0b00111000); // middle three bits
    B = (pixel & 0b11000000); // highest two bits
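To make the layout concrete, here's a small standalone C sketch of the extraction above (the helper names red3/green3/blue2 are just mine for illustration); it shifts each field down so the channel values come out in their natural range:

    #include <stdio.h>

    /* Extract each channel from a BBGGGRRR byte, shifted down
       so R and G are in 0..7 and B is in 0..3. */
    static unsigned char red3(unsigned char p)   { return  p & 0x07; }       /* bits 2..0 */
    static unsigned char green3(unsigned char p) { return (p >> 3) & 0x07; } /* bits 5..3 */
    static unsigned char blue2(unsigned char p)  { return (p >> 6) & 0x03; } /* bits 7..6 */

    int main(void)
    {
        unsigned char p = 0xFF; /* all channels at maximum */
        printf("R=%u G=%u B=%u\n", red3(p), green3(p), blue2(p)); /* R=7 G=7 B=3 */
        return 0;
    }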
But reading the man page for glTexImage2D, I'm somewhat confused about which type I should specify so that my data is interpreted this way. I'm assuming GL_UNSIGNED_BYTE_2_3_3_REV, but I can't test it out on this computer right now and the doubt is killing me!
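For what it's worth, here's the packing I believe GL_UNSIGNED_BYTE_2_3_3_REV implies (first component, red, in the lowest bits; the non-REV GL_UNSIGNED_BYTE_3_3_2 would be the mirror image, RRRGGGBB), sketched as plain C so the bit positions can at least be sanity-checked offline; pack_233_rev is a hypothetical helper name:

    #include <stdio.h>

    /* If my reading is right, the upload would look like:
       glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                    GL_RGB, GL_UNSIGNED_BYTE_2_3_3_REV, data);
       with each byte of data laid out as BBGGGRRR: */
    static unsigned char pack_233_rev(unsigned char r, unsigned char g, unsigned char b)
    {
        /* bits 7..6 = blue, bits 5..3 = green, bits 2..0 = red */
        return (unsigned char)((b << 6) | (g << 3) | r);
    }

    int main(void)
    {
        printf("0x%02X\n", pack_233_rev(7, 0, 0)); /* red only  -> 0x07 */
        printf("0x%02X\n", pack_233_rev(0, 7, 0)); /* green only -> 0x38 */
        printf("0x%02X\n", pack_233_rev(0, 0, 3)); /* blue only -> 0xC0 */
        return 0;
    }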
I didn't make this image, but it is exactly the spectrum I'm using:
^this is what I’d expect to see when sending in an array with all values from 0 to 255.