Problem with integer texture

Hi,

I am working with integer textures on a G80 and have run into the following problem:

On the CPU side I have the array unsigned char a[] = {1, 2, 3, 4, 5, 6}. I then write it into a texture with the following attributes (a rough sketch of the upload call follows the list):
width: 1, height: 6
target: GL_TEXTURE_RECTANGLE_ARB
internalformat: GL_LUMINANCE8UI_EXT
format: GL_LUMINANCE_INTEGER_EXT
type: GL_UNSIGNED_BYTE
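
For reference, the upload looks roughly like this (a simplified sketch, with error checking omitted and the variable names made up for the post):

unsigned char a[] = {1, 2, 3, 4, 5, 6};

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);

// integer textures must use nearest filtering
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// width 1, height 6, one 8-bit unsigned integer channel per texel
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_LUMINANCE8UI_EXT,
             1, 6, 0, GL_LUMINANCE_INTEGER_EXT, GL_UNSIGNED_BYTE, a);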

Now reading the texels from the texture at the coordinates (x,y) gives this result:
(0.5,0.5) = 1
(0.5,1.5) = 5
(0.5,2.5) = not specified
(0.5,3.5) = not specified (and so on)

It looks as if 3 × 8 bits were missing between consecutive texels. Using the same texture with an unsigned int array on the CPU side works just fine.
On the other hand, using an unsigned char array but a texture with the internal format GL_RGBA8UI_EXT and the format GL_RGBA_INTEGER_EXT also works.
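
One way to double-check what actually ends up in the texture is to read it back on the CPU side; a minimal sketch, assuming the rectangle texture is still bound:

unsigned char check[6] = {0};
glPixelStorei(GL_PACK_ALIGNMENT, 1); // readback has its own row alignment setting
glGetTexImage(GL_TEXTURE_RECTANGLE_ARB, 0, GL_LUMINANCE_INTEGER_EXT, GL_UNSIGNED_BYTE, check);
// compare check[0..5] against the original array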

Does anyone have an idea? Am I doing something wrong?

I’m grateful for any help. :smiley:

Try putting:

glPixelStorei(GL_PACK_ALIGNMENT,1);
glPixelStorei(GL_UNPACK_ALIGNMENT,1);

somewhere near the beginning of your app.
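
The background: GL_UNPACK_ALIGNMENT defaults to 4, so GL assumes every row of the source data starts on a 4-byte boundary. A row of your 1-texel-wide GL_LUMINANCE8UI_EXT texture is only 1 byte, so the rows are fetched from byte offsets 0, 4, 8, ... of your 6-byte array; that is why you see 1, then 5, then undefined values, and why the cases with 4 bytes of source data per row (the unsigned int array, GL_RGBA8UI_EXT with four bytes per texel) happened to work unchanged. A sketch of the fixed upload, reusing the parameters from your first post:

glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // source rows are tightly packed
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_LUMINANCE8UI_EXT,
             1, 6, 0, GL_LUMINANCE_INTEGER_EXT, GL_UNSIGNED_BYTE, a);

GL_PACK_ALIGNMENT is the corresponding setting for readbacks (glReadPixels, glGetTexImage).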

N.

Setting GL_UNPACK_ALIGNMENT to 1 did it. Thanks a lot.