Does anyone know the principle of how the const int *value translates into
The way the OpenGL specification says it does. It’s exactly as stated in the type: a single 32-bit integer in which the highest 2 bits hold the last (fourth) component. (There is no GL_INT_2_10_10_10, only GL_INT_2_10_10_10_REV, so the components are stored in reverse order.) The next 10 bits hold the third component, the next 10 the second, and the lowest 10 bits the first. Each group of bits is a two’s-complement signed integer.