I’m using a vertex array where the RGBA components are GL_UNSIGNED_BYTE.
Changing the alpha component from 0xFE to 0xFF affects the blue component in a bizarre way.
When A = 0xFF, B appears to be ORed with 0xC0; once B reaches 0xC0 it behaves normally up to 0xFF. The result is that with 0xFE008080 vs. 0xFF008080, the second color appears noticeably bluer, yet 0xFE0080C0 and 0xFF0080C0 look identical.
Why is this happening? If I disable the alpha channel it looks fine, but then of course I have no translucency.