Luminance-alpha and GLSL?

I have a pointer to a luminance+alpha float32 array.
I want to store it in a GPU texture as GL_LUMINANCE_ALPHA16F_ARB,
then display luminance and alpha through a GLSL fragment shader.

My texture is defined as follows:

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA16F_ARB, 2048, 2048, 0, GL_LUMINANCE_ALPHA, GL_FLOAT, 0);

Then when needed, I transfer part of my array from RAM to VRAM like this:

glTexSubImage2D(GL_TEXTURE_2D,0,0,0,width,height,GL_LUMINANCE_ALPHA,GL_FLOAT,data);

Here’s the fragment shader I use to display luminance and alpha as red and green:

gl_FragColor = vec4(texture2D(source, gl_TexCoord[0].xy).xy,0.,0.);

However something is obviously wrong: for one thing, red and green both show the same single value, where I’m expecting two separate values.

- Is there anything specific about dealing with luminance-alpha textures in GLSL?

- Am I OK with the other parts of the code (glTexImage2D/glTexSubImage2D)?

Thanks

(I’m using GeForce GTX 260 with latest drivers on Windows 7)

For GL_LUMINANCE, L gets copied into channels r, g and b; channel a is set to 1.
For GL_LUMINANCE_ALPHA, L gets copied into channels r, g and b, and A ends up in channel a.

GL_LUMINANCE and GL_LUMINANCE_ALPHA are deprecated texture formats. They got replaced by GL_RED and GL_RG texture formats (which behave like you expected). If you want GL_RED/GL_RG textures to behave like the old GL_LUMINANCE/GL_LUMINANCE_ALPHA textures, you can use the ARB_texture_swizzle extension.
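A sketch of the swizzle approach, assuming a context that exposes GL 3.3 or ARB_texture_swizzle: store the data as a two-channel GL_RG texture, then set the swizzle so that sampling reproduces the old GL_LUMINANCE_ALPHA behavior.

```c
/* Two-channel storage instead of luminance-alpha. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RG16F, 2048, 2048, 0, GL_RG, GL_FLOAT, 0);

/* Make sampling behave like GL_LUMINANCE_ALPHA:
 * r, g, b all read from the stored R channel, a from the stored G channel. */
GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_GREEN };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
```

Without the swizzle (the default R,G,B,A mapping), the shader’s .xy would read the two stored channels directly, which is the behavior you originally expected.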

Thanks for the info !

However, I’m afraid GL_RG is not backward compatible:
it is not mentioned in this ATI table up to R500, and in this NVIDIA table it only works since G80, not before.

Should I stick with GL_LUMINANCE_ALPHA if I want my app to be compatible for ATI x300 and GeForce 6200 ?

Should I stick with GL_LUMINANCE_ALPHA if I want my app to be compatible for ATI x300 and GeForce 6200 ?

Probably yes, since they don’t support GL 3.x+ anyway. So you’re stuck with GL 2.1, where GL_LUMINANCE_ALPHA is supported.
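If you do stay with GL_LUMINANCE_ALPHA, the shader fix follows from the channel mapping above: luminance is replicated into r, g and b while alpha lands in a, so read the pair with .ra (equivalently .xw) instead of .xy. A sketch of the corrected fragment shader line:

```glsl
// L is in r/g/b and A is in a, so .ra gives the (luminance, alpha) pair:
gl_FragColor = vec4(texture2D(source, gl_TexCoord[0].xy).ra, 0., 0.);
```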