I have been searching, unsuccessfully, for the answer to what I thought was a simple question: what does my OpenGL application (actually WebGL, but I assume the answer is the same) have to do differently to properly support 30-bit color displays? It is a simple 2D image display application (medical images) in which I have implemented my own fragment shader. In this shader I am already passing in 16-bit grayscale textures and using them to generate the appropriate normalized gl_FragColor value. Is it as simple as the OpenGL driver converting that normalized value to a 30-bit (10 bpc) value for display, or does additional coding need to be done elsewhere in my pipeline to support these wide-gamut displays?
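To quantify what I'm asking about: quantizing the shader's normalized output to 8 vs. 10 bits per channel yields 256 vs. 1024 representable gray levels from my 16-bit source data. A minimal sketch of that arithmetic in plain JavaScript (no GL API involved; the function names are mine, for illustration only):

```javascript
// Simulate a driver quantizing a normalized [0,1] color channel
// to an n-bit fixed-point framebuffer value and back.
function quantize(v, bits) {
  const maxLevel = (1 << bits) - 1; // 255 for 8 bpc, 1023 for 10 bpc
  return Math.round(v * maxLevel) / maxLevel;
}

// Count how many distinct displayed levels survive from 16-bit input.
function distinctLevels(bits) {
  const seen = new Set();
  for (let i = 0; i <= 65535; i++) {
    seen.add(quantize(i / 65535, bits));
  }
  return seen.size;
}

console.log(distinctLevels(8));  // 256 gray levels on an 8 bpc display
console.log(distinctLevels(10)); // 1024 gray levels on a 10 bpc display
```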
NVIDIA Quadro GPUs support this:
[ul]
[li]10- and 12-bit Grayscale Technology[/li]
[li]30-Bit Color Technology for NVIDIA Quadro[/li]
[li]How to enable 30-bit color on Windows platforms[/li]
[li]How to Enable 30-bit color on Linux[/li]
[li]Configuring Depth 30 Displays (NVIDIA Linux driver README)[/li]
[li]10-bits Lookup Table (devtalk forum post)[/li]
[li]Grayscale10bit.zip (NVIDIA 10/12-bit Grayscale Sample Code)[/li]
[/ul]
as do some AMD consumer and professional-line GPUs:
[ul]
[li]AMD’s 10-bit Video Output Technology[/li]
[/ul]
NVIDIA says its GeForce GPUs have supported it since the GeForce 200 series (for DirectX fullscreen windows only), but provides little detail on that:
[ul]
[li]10-bit per color support on NVIDIA Geforce GPUs[/li]
[/ul]
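Whichever vendor path you take, these 30-bit modes generally scan out a packed 10-10-10-2 pixel (the same layout as OpenGL's RGB10_A2 sized format): 10 bits each for red, green, and blue, plus 2 bits of alpha, with red in the low bits. A small sketch of that packing in plain JavaScript (the function name is mine, for illustration only):

```javascript
// Pack normalized [0,1] RGBA into a 32-bit RGB10_A2-style word:
// 10 bits each for R/G/B (0..1023), 2 bits for alpha (0..3),
// red in the least-significant bits.
function packRGB10A2(r, g, b, a) {
  const to10 = v => Math.round(v * 1023) & 0x3ff;
  const to2  = v => Math.round(v * 3) & 0x3;
  return ((to2(a) << 30) | (to10(b) << 20) | (to10(g) << 10) | to10(r)) >>> 0;
}

console.log(packRGB10A2(1, 1, 1, 1).toString(16)); // "ffffffff"
console.log(packRGB10A2(0.5, 0.5, 0.5, 1).toString(16));
```

The point for your shader is that nothing changes on your end: you still write a normalized gl_FragColor, and the driver performs this quantization against whatever framebuffer depth was negotiated.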