OpenGL access to 64-bit (16 bits per channel) texturing, filtering and blending


I have been working recently on getting textures with 16 bits of precision per channel, experimenting with pbuffers and so on.

I was reading an NVIDIA document that claims the following:

And best of all, NVIDIA’s 64-bit floating point texture filtering and blending
technology is implemented in hardware. There is no pixel shader encode or decode
to deal with. Furthermore, it is already exposed in Microsoft DirectX® 9.0 and
OpenGL® APIs.
Am I correct in thinking that this is exposed through NV_float_buffer, ATI_texture_float, WGL_ATI_pixel_format_float, and the upcoming (released but patented) ARB_texture_float and ARB_color_buffer_float?


Yes, these features will be exposed through the ARB_texture_float/ARB_color_buffer_float extensions. On current shipping drivers they are exposed through the ATI_texture_float/WGL_ATI_pixel_format_float extensions on all GeForce 6 Series GPUs.

The NV_float_buffer restrictions on blending and texture type do not change, though.

Thanks, that's what I needed to know!

There is a simple example here: