Gamma correction table

We’re trying to gamma-correct for a nice new LCD projector that we have so that the values in the framebuffer are converted linearly into brightness (i.e. 128 is half brightness). We know this is going to cost us in terms of banding in the low range, but we need the linearity for color-correction that we’re doing.

Now the interesting part is that the projector doesn’t have a simple power-law response to input values (which is what gamma correction assumes), but it’s more like an S-shape.

Thus standard gamma values won’t help us, we need access to the lookup tables inside the card to adjust for that. Is there any way to directly manipulate the lookup tables? Our apps are OpenGL, but as this is probably a global setting it should be possible to use a D3D tool, if really necessary.

Any hints welcome. We could probably get pretty far using dependent texture tricks, but the performance cost is not encouraging, and the hardware has to do it anyway. We just need a tiny little API…



You need a video lookup table. I’ll bet that’s how it’s done in practice on some cards just to support gamma correction. I don’t think there’s an exposed API to write this on PCs. It would be very cool.

In Windows you can set the gamma table using SetDeviceGammaRamp. Is that what you were after?

Microsoft was supposed to add gamma lookup table support to ICM 3.0 (but the document I am looking at was written in 1998, so who knows).

An interface is exposed in DirectDraw 7 and 8 (GetGammaRamp and SetGammaRamp).

You may be able to use the one from DirectDraw 7. It requires that you create a primary surface and then get that surface’s IDirectDrawGammaControl interface. I think you can do this and still use a window for OpenGL rendering. But I’ve never used DirectX to initialize fullscreen mode, so I don’t know what you can and can’t do and still be able to use OpenGL.
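The DirectDraw 7 path described above looks roughly like the following. This is an untested sketch: it needs the DirectX 7 SDK headers, error checking is omitted, and the variable names are made up for illustration.

```
// Sketch only -- requires ddraw.h from the DX7 SDK; no error checking.
LPDIRECTDRAW7 dd;
DirectDrawCreateEx(NULL, (void **)&dd, IID_IDirectDraw7, NULL);
dd->SetCooperativeLevel(hwnd, DDSCL_NORMAL);

DDSURFACEDESC2 ddsd = { sizeof(ddsd) };
ddsd.dwFlags = DDSD_CAPS;
ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
LPDIRECTDRAWSURFACE7 primary;
dd->CreateSurface(&ddsd, &primary, NULL);

LPDIRECTDRAWGAMMACONTROL gammaCtl;
primary->QueryInterface(IID_IDirectDrawGammaControl, (void **)&gammaCtl);

DDGAMMARAMP ramp;   // three WORD[256] arrays: red, green, blue
// ... fill ramp.red / ramp.green / ramp.blue ...
gammaCtl->SetGammaRamp(0, &ramp);
```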

Looks like that’s what he requires. Rather than a single gamma value it takes a full 3*256 array of entries.

Hmm, yup, that looks like what I’m looking for. Nice!

Has anybody actually used this function and/or has experience how widely it is supported?

And now the tricky part: does anybody know if there’s an equivalent for Linux?

Thanks for the quick response


Yes, I’ve used it. It’s supported on all cards that I’ve used (3Dfx, ATI and NVIDIA; supported on the old Matrox Mystique too, if I remember correctly). If you need to support Voodoos then you have to use the equivalent wglSetDeviceGammaRamp3DFX. Basically, if you can change the gamma of your desktop, then I think it should work.

I don’t know about Linux ( one must exist in X-Windows I guess ).

void GenerateGammaCorrectionTable(double gamma, unsigned char gtable[256])
{
	/* requires <math.h> for pow() and floor() */
	for (int i = 0; i < 256; i++)
	{
		double y = (double)i / 255.0;                      /* normalize to [0,1] */
		y = pow(y, 1.0 / gamma);                           /* apply inverse gamma */
		gtable[i] = (unsigned char)floor(255.0 * y + 0.5); /* round back to 8 bits */
	}
}

You might want to generate a different table for R, G and B.

Here’s my code for vanilla gamma correction:

void set_screen_gamma(float value)
{
	ushort ramp[256*3];  // red, green and blue ramps, stored consecutively

	float exponent = 1.0f/value;

	for (uint i = 0; i < 256; ++i)
	{
		float linear = float(i)/255.0f;
		float corrected = (float)pow(linear, exponent);
		ushort entry = ushort(corrected*65535.0f + 0.5f);
		ramp[i] = ramp[i+256] = ramp[i+512] = entry;  // same curve for all three channels
	}

	HDC screen = GetDC(NULL);
	SetDeviceGammaRamp(screen, ramp);
	ReleaseDC(NULL, screen);
}

A few things to note:
1) There are actually three gamma ramps, for red, green and blue, in that order.
2) The ramps are stored consecutively (i.e. non-interleaved) as 256 unsigned shorts each, with full 16-bit range.
3) You don’t need DX; SetDeviceGammaRamp is a GDI function, and yes, it affects everything.
4) You can also get the current gamma ramp with - tada - GetDeviceGammaRamp. This way you can avoid messing up settings when your app closes.

[This message has been edited by zeckensack (edited 08-26-2002).]

The functions XF86VidModeSetGammaRamp and XF86VidModeGetGammaRamp can be used with the XFree86 X Window System to set and get gamma.
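The XFree86 call sequence looks roughly like this. Untested sketch: it needs the xf86vmode extension header, a running X server, and linking with -lXxf86vm; error checking is omitted.

```
/* Sketch only -- requires <X11/extensions/xf86vmode.h>, link -lXxf86vm. */
Display *dpy = XOpenDisplay(NULL);
int screen = DefaultScreen(dpy);

int size;
XF86VidModeGetGammaRampSize(dpy, screen, &size);  /* usually 256 */

unsigned short *red   = malloc(size * sizeof(unsigned short));
unsigned short *green = malloc(size * sizeof(unsigned short));
unsigned short *blue  = malloc(size * sizeof(unsigned short));
/* ... fill the three ramps with 16-bit values ... */
XF86VidModeSetGammaRamp(dpy, screen, size, red, green, blue);
```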

[This message has been edited by Nakoruru (edited 08-27-2002).]