16 bit alpha?


MSDN says that alpha bitplanes are not supported. Is there any way to get more than 8 bits of alpha?


Render directly to a texture. Use a PBuffer or a Framebuffer Object (FBOs are more flexible and much simpler) and render to a texture with a format such as GL_RGBA_FLOAT16_ATI.
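A minimal sketch of that approach, assuming an active GL context plus the EXT_framebuffer_object and ARB_texture_float extensions (GL_RGBA16F_ARB is the ARB token equivalent to the vendor token GL_RGBA_FLOAT16_ATI; in a real app the glGenFramebuffersEXT-family entry points must be fetched with wglGetProcAddress):

```c
/* Sketch: create an FP16 render target backed by an FBO.
   Assumes a current GL context and the EXT_framebuffer_object and
   ARB_texture_float extensions. Error handling is abbreviated. */
#include <GL/gl.h>
#include <GL/glext.h>

GLuint create_fp16_target(GLsizei w, GLsizei h, GLuint *fbo_out)
{
    GLuint tex, fbo;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* 16-bit float per channel, including alpha. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, w, h, 0,
                 GL_RGBA, GL_HALF_FLOAT_ARB, NULL);
    /* Stick to GL_NEAREST; older cards cannot filter FP16 linearly. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
            != GL_FRAMEBUFFER_COMPLETE_EXT)
        return 0;  /* incomplete framebuffer */

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    *fbo_out = fbo;
    return tex;  /* bind fbo to render into tex, then sample tex as usual */
}
```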

Note that not all cards support blending when rendering to an FP16 texture. It requires a GeForce 6 or a Radeon X1*00.

But if you only need 16-bit alpha in a source texture while rendering to a standard 8-bit framebuffer, blending will work just fine. However, use only GL_NEAREST or GL_NEAREST_MIPMAP_NEAREST filtering on FP16 textures. Again, a GeForce 6 will support GL_LINEAR filtering, but a Radeon X800 will not.
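There is no direct GL query for "can this card blend/filter FP16", so in practice you check the extension string (e.g. for GL_ARB_texture_float) and fall back accordingly. A plain strstr() on glGetString(GL_EXTENSIONS) is a classic bug, since one extension name can be a prefix of another; a sketch of a whole-token check (the helper name is illustrative):

```c
#include <string.h>

/* Check for an extension name in a space-separated extension list,
   such as the string returned by glGetString(GL_EXTENSIONS).
   strstr() alone is not enough: "GL_ARB_texture_float" would also
   match inside "GL_ARB_texture_float_linear", so match whole tokens. */
static int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        int start_ok = (p == extlist) || (p[-1] == ' ');
        int end_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

Call it as `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_texture_float")` once a context is current. Note the extension only tells you the formats exist; whether blending into them is accelerated still depends on the card, as described above.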

MSDN is talking about the first implementation of GL on Windows 95. It didn’t support destination alpha for the backbuffer.

Nowadays, all video cards support an RGBA8 backbuffer. There is also the RGBA16 accumulation buffer, which is accelerated by modern cards. There are a bunch of other formats as well (float and unsigned integer).
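Since the question cites MSDN, this is presumably WGL; a sketch of requesting those destination-alpha bitplanes when picking the pixel format (hdc is assumed to be a valid device context for the window, and the function name is illustrative):

```c
#include <windows.h>

/* Sketch: ask WGL for a backbuffer with 8 alpha bitplanes. */
BOOL setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cAlphaBits = 8;   /* destination alpha in the backbuffer */
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0)
        return FALSE;
    return SetPixelFormat(hdc, fmt, &pfd);
}
```

ChoosePixelFormat returns the closest match, so check the format actually granted (DescribePixelFormat) if the 8 alpha bits are mandatory.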
