16bit textures....what parameters again?

Hello again people!

I’ve got a 16bit image that I would like to load as a texture, but I’m not sure what the parameters to the glTexImage2D call should be. Help!

I know for a 24bit image it should be:

glTexImage2D(GL_TEXTURE_2D, 0, 3, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

What should this call look like for a 16bit texture?

Originally posted by TheGecko:
[b]Hello again people!

I’ve got a 16bit image that I would like to load as a texture, but I’m not sure what the parameters to the glTexImage2D call should be. Help!

I know for a 24bit image it should be:

glTexImage2D(GL_TEXTURE_2D, 0, 3, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

What should this call look like for a 16bit texture? [/b]

Use this call:

glTexImage2D(GL_TEXTURE_2D, 0, 3, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_SHORT, Data);

Kosta

Hey Kosta… that didn’t work. My application crashes.

Originally posted by TheGecko:
Hey Kosta… that didn’t work. My application crashes.

What do you mean when you say you have a 16bit texture? Does each color component (R, G and B) consist of 2 bytes? If so, you must have allocated at least width * height * 2 bytes. You probably also have to change the pixel alignment via glPixelStore*().

Kosta
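
A minimal sketch of the alignment change Kosta mentions, assuming the rows of Data are tightly packed and that the crash comes from the default 4-byte unpack alignment (this is an assumption, not something stated in the thread):

/* Assumption: rows of Data are tightly packed. The default unpack
   alignment is 4 bytes, so glTexImage2D may read past the end of each
   row (and crash) when a row's byte size is not a multiple of 4. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, 3, Width, Height, 0,
             GL_BGR_EXT, GL_UNSIGNED_SHORT, Data);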

No, you’re wrong. The third argument will specify how your pixels are encoded into the texture.

AFAIK, the call
glTexImage2D(GL_TEXTURE_2D, 0, 3, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);
lets OpenGL choose the texture format.

To use a true 24 bits format, use
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

and to use the 16 bits format, use
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB4, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

Check out this page: http://trant.sgi.com/opengl/docs/man_pages/hardcopy/GL/html/gl/teximage2d.html

Y.
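
A small sketch, not from the thread, showing one way to verify which internal format the driver actually chose for level 0 of the currently bound texture:

#include <stdio.h>
#include <GL/gl.h>

void PrintChosenFormat(void)
{
    GLint internalFormat = 0, redBits = 0, greenBits = 0, blueBits = 0;

    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &redBits);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &greenBits);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE, &blueBits);

    printf("internal format 0x%x, R/G/B bits: %d/%d/%d\n",
           internalFormat, redBits, greenBits, blueBits);
}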

Originally posted by Ysaneya:
[b]No, you’re wrong. The third argument will specify how your pixels are encoded into the texture.

AFAIK, the call
glTexImage2D(GL_TEXTURE_2D, 0, 3, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);
lets OpenGL choose the texture format.

To use a true 24 bits format, use
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

and to use the 16 bits format, use
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB4, Width,Height,0, GL_BGR_EXT,GL_UNSIGNED_BYTE, Data);

Check out this page: http://trant.sgi.com/opengl/docs/man_pages/hardcopy/GL/html/gl/teximage2d.html

Y.[/b]

I think the given image data (the data pointed to by the variable “Data”) consists of 16 bits per color component.

Something like this:

GLushort* Data = …

So it’s not the internal data representation that should be changed, but the “type” parameter, specifying that you use GLushort instead of GLubyte!

Kosta

And a 24bit image consists of 24 bits per color component, right?

You can use 16bit images directly only if the 3D card has the GL_EXT_packed_pixels extension.
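
For example, a sketch assuming the 16bit pixels are packed as 5-5-5-1 RGBA shorts (an assumption about Data, not something stated in the thread); check for the extension before relying on it:

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* for GL_UNSIGNED_SHORT_5_5_5_1_EXT */

void Upload16BitTexture(GLsizei Width, GLsizei Height, const GLushort *Data)
{
    /* Simplistic substring check for GL_EXT_packed_pixels. */
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    if (!exts || !strstr(exts, "GL_EXT_packed_pixels"))
        return;   /* no packed pixels: convert to 24bit instead (see below) */

    /* GL_RGB5_A1 requests a 16bit internal format; the type token tells
       OpenGL how each 16bit value in Data is laid out. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, Width, Height, 0,
                 GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1_EXT, Data);
}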

No. 24 bits = 24 bits per pixel = 8 bits per component = 1 byte for each component

Exactly…
And the question was about a 16bit image, not about 16 bits per color component.
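
If the extension is missing, a common fallback is to expand the image to 24bit and upload that. A sketch, assuming the 16bit data is in a 5-6-5 layout (again an assumption about Data):

#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* for GL_BGR_EXT */

void Upload565AsBGR(GLsizei Width, GLsizei Height, const GLushort *Src)
{
    GLubyte *bgr = (GLubyte *)malloc((size_t)Width * Height * 3);
    if (!bgr)
        return;

    for (long i = 0; i < (long)Width * Height; ++i) {
        GLushort p = Src[i];
        /* Expand 5/6/5 bits per channel to 8 bits each. */
        GLubyte r = (GLubyte)(((p >> 11) & 0x1F) * 255 / 31);
        GLubyte g = (GLubyte)(((p >>  5) & 0x3F) * 255 / 63);
        GLubyte b = (GLubyte)(( p        & 0x1F) * 255 / 31);
        bgr[i * 3 + 0] = b;   /* BGR order to match GL_BGR_EXT */
        bgr[i * 3 + 1] = g;
        bgr[i * 3 + 2] = r;
    }

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* rows are tightly packed */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, Width, Height, 0,
                 GL_BGR_EXT, GL_UNSIGNED_BYTE, bgr);
    free(bgr);
}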