GL_INTENSITY16

Hello,

There seems to be some problem using glTexImage2D() with GL_INTENSITY16. Basically, I have a 2D texture that holds only a single value per texel, which I'd like to texture onto a polygon. The values are 16-bit unsigned ints, so I am using GLuint and GL_UNSIGNED_SHORT for texturing.

When I store the texture as RGBA, copying the single value into R, G and B and setting A = 1, the texturing seems to work fine. Please let me know if I am doing anything wrong and whether the use of GL_INTENSITY16 is appropriate.
Thanks,
aj

Yes, use this:

internalformat : GL_LUMINANCE16

format : GL_LUMINANCE

type : GL_UNSIGNED_SHORT
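
For reference, the upload call would then look roughly like this. This is just a minimal sketch: it assumes a current GL context, a texture object already bound to GL_TEXTURE_2D, and that width, height and pixels (a GLushort array holding your 16-bit values) are placeholders for your own data.

/* Upload a single-channel 16-bit texture as luminance. */
glTexImage2D(GL_TEXTURE_2D,      /* target */
             0,                  /* mip level */
             GL_LUMINANCE16,     /* internalformat: requested storage */
             width, height,      /* texture dimensions */
             0,                  /* border */
             GL_LUMINANCE,       /* format of the client-side data */
             GL_UNSIGNED_SHORT,  /* type: one GLushort per texel */
             pixels);            /* pointer to the 16-bit data */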

You could also try varying the internal format to GL_INTENSITY* depending on what you want the fragment alpha to do. Remember that your graphics card may not support 16-bit textures, in which case you'll get a white texture. So, if it doesn't work, try GL_LUMINANCE12 or GL_LUMINANCE8 for the internalformat param.

Use the proxy texture mechanism to determine at runtime what is supported at your required resolution.
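
Something along these lines, as a rough sketch (width and height stand in for your required size; a current GL context is assumed):

/* Probe GL_LUMINANCE16 support at the required size via a proxy texture.
   On failure the proxy's state is zeroed, so querying its width tells
   you whether the request was accepted. */
GLint probedWidth = 0;
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_LUMINANCE16,
             width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                         GL_TEXTURE_WIDTH, &probedWidth);
if (probedWidth == 0) {
    /* GL_LUMINANCE16 at this size was rejected; repeat the proxy call
       with GL_LUMINANCE12, then GL_LUMINANCE8. */
}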

dorbie,
sized internal formats aren't enforced. A proper implementation won't fail if it doesn't support the exact precision you request; it will instead convert to one of the internal formats it does support.
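
If you want to see what you actually got, you can query the allocated precision after the upload. Rough sketch, assuming the texture is bound to GL_TEXTURE_2D:

/* Ask how many bits were really allocated for the luminance channel.
   A GL_LUMINANCE16 request may legitimately come back as 8. */
GLint lumBits = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_LUMINANCE_SIZE, &lumBits);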
