16-bit grayscale textures

I’m trying to load a 16-bit monochrome texture using a pbuffer and then read it back to memory. I tried several configurations, but none worked well. I’m using resident textures initialized with this call:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, 2048, 2048, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);

When loading the texture data, I’m using:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_LUMINANCE, GL_UNSIGNED_SHORT, imgBuffer);

When I read the pixels back to memory, I see that all the components (r, g, b) are the same, but the values are wrong. I’m using: glReadPixels(0, 0, x, y, GL_GREEN, GL_UNSIGNED_SHORT, imgBuffer);

I tried several internal formats, such as GL_LUMINANCE, GL_LUMINANCE16 (which gave only zeros), GL_LUMINANCE16_ALPHA16, etc.; each gave different output.
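For what it’s worth, here is a small sketch of how one can check what the driver actually allocates for a given internal format (just the standard glGetTexLevelParameteriv query, not my original code):

GLint lumBits = 0, alphaBits = 0;
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 2048, 2048, 0,
             GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
/* ask how many bits were really allocated - the driver may silently fall back to 8 */
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits);
printf("luminance bits: %d, alpha bits: %d\n", lumBits, alphaBits); /* needs <stdio.h> */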

I also tried initializing the pbuffer with different configurations. Using wglChoosePixelFormatARB, I can’t get a pixel format with 16-bit color components when I request these attributes:
WGL_DRAW_TO_PBUFFER_ARB, TRUE
WGL_COLOR_BITS_ARB, 64
WGL_GREEN_BITS_ARB, 16
WGL_RED_BITS_ARB, 16
WGL_BLUE_BITS_ARB, 16
WGL_ALPHA_BITS_ARB, 16
WGL_DEPTH_BITS_ARB, 24
WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB
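
For completeness, this is roughly how the request looks in code (hDC, the zero terminator and the output variables are the usual boilerplate; the names are mine):

/* assumes a valid window DC and that the WGL_ARB_pixel_format / WGL_ARB_pbuffer
   tokens and function pointers are already loaded */
int attribs[] = {
    WGL_DRAW_TO_PBUFFER_ARB, TRUE,
    WGL_COLOR_BITS_ARB,      64,
    WGL_RED_BITS_ARB,        16,
    WGL_GREEN_BITS_ARB,      16,
    WGL_BLUE_BITS_ARB,       16,
    WGL_ALPHA_BITS_ARB,      16,
    WGL_DEPTH_BITS_ARB,      24,
    WGL_ACCELERATION_ARB,    WGL_FULL_ACCELERATION_ARB,
    0  /* terminator */
};
int pixelFormat = 0;
UINT numFormats = 0;
if (!wglChoosePixelFormatARB(hDC, attribs, NULL, 1, &pixelFormat, &numFormats) || numFormats == 0)
{
    /* no matching format - this is where I end up with the attributes above */
}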

When using wglChoosePixelFormatARB with these attributes, I can only get formats with 8 bits per component, but when I call the function without attributes, I can find the above configuration among the results.
I tried to force this configuration, but I still got wrong values. For example, if my texture value was 0xFF or 0xF0, I got back 257 in imgBuffer; if it was 0xF, I got 0.

I’m using a GeForce 6200.

I don’t know if GL_LUMINANCE16 is supported on the 6200 (NVIDIA has a PDF on their site listing which texture formats each card supports), but anyway, other things to try:
* You can perhaps use the HILO texture format.
* You can use a standard texture, e.g. RGB, and store the high byte of each 16-bit value in red and the low byte in green, then unpack them in a shader, e.g. X = R*256 + G.
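
A rough sketch of that pack/unpack idea (CPU-side packing in C plus the GLSL unpack; the variable names here are made up):

/* Pack each 16-bit value into two 8-bit channels of an RGB texture:
   high byte -> red, low byte -> green (blue unused). */
unsigned short value = src[i];
rgb[3*i + 0] = (unsigned char)(value >> 8);    /* high byte in red  */
rgb[3*i + 1] = (unsigned char)(value & 0xFF);  /* low byte in green */
rgb[3*i + 2] = 0;

/* upload with e.g.
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, rgb); */

/* Fragment shader that reconstructs the 16-bit value as X = R*256 + G,
   where R and G are the raw bytes (the sampled values times 255). */
const char *fragSrc =
    "uniform sampler2D tex;\n"
    "void main() {\n"
    "    vec4 c = texture2D(tex, gl_TexCoord[0].xy);\n"
    "    float x = c.r * 255.0 * 256.0 + c.g * 255.0;  // 0..65535\n"
    "    gl_FragColor = vec4(x / 65535.0);             // renormalize for output\n"
    "}\n";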