glColorTableEXT Performance

Is this function deprecated and/or inefficient?

I want to implement oldskool-style palette cycling effects, and so far this is the approach I’ve been using, but I don’t want to continue with it if it’s madly bad.

So far the main problem I can see is having to re-bind every texture that uses the palette colors. I would probably have to do this once per frame to reflect the cycled palette, maintaining a list of which texture names are paletted textures and then iterating through them each frame.
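A minimal sketch of that per-frame loop might look like this. The names (`palette`, `cycle_palette`, `update_paletted_textures`) are illustrative, not from any real codebase; the GL calls are shown as comments so the cycling logic stands on its own, since `glBindTexture` and `glColorTableEXT` (from GL_EXT_paletted_texture) need a live context.

```c
#include <string.h>

/* One shared 256-entry RGBA palette (illustrative). */
static unsigned char palette[256][4];

/* Rotate entries [first, last] by one slot to animate the palette. */
static void cycle_palette(int first, int last)
{
    unsigned char saved[4];
    memcpy(saved, palette[last], 4);
    memmove(palette[first + 1], palette[first], (size_t)(last - first) * 4);
    memcpy(palette[first], saved, 4);
}

/* Per frame: cycle the palette, then re-upload the table to every
   paletted texture. The GL calls are commented out so this compiles
   standalone; in real code they do the actual work. */
static void update_paletted_textures(const unsigned int *tex_names, int count)
{
    int i;
    cycle_palette(16, 31);              /* e.g. animate entries 16..31 */
    for (i = 0; i < count; ++i) {
        (void)tex_names[i];
        /* glBindTexture(GL_TEXTURE_2D, tex_names[i]);
           glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256,
                           GL_RGBA, GL_UNSIGNED_BYTE, palette); */
    }
}
```

Note the per-texture re-upload is exactly the cost being worried about here; GL_EXT_shared_texture_palette (where supported) avoids it by giving all paletted textures one shared table.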

Here is a discussion on paletted textures. glColorTableEXT is part of the GL_EXT_paletted_texture extension, and towards the end, I found out that it is being deprecated.

I would also like to know how to do fast palette animation.

Does anyone have an idea “why” it is so difficult (or undesirable) to include hardware support for paletted textures?

Never mind about “why” — if I had read more I would have found this:

Paletted textures will not be supported on GeForceFX. While the functionality obviously has uses, it consumed a disproportionate amount of real estate relative to the number of applications that made use of it.

I guess one way to convince them to add the extension is to write a “killer” app that emulates it in software (albeit slowly) to show that it is worthwhile.

Back to the original post, I will try to get back to you on how to write a fast software implementation when I get time (and find out how).

The only solution I can come up with depends entirely on what you plan to do. You could try storing all the texture variations in graphics RAM, or, if your textures are small (and don’t change many times per frame), you could update them every frame.
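The “update every frame” route can work without the extension at all: keep the image as 8-bit indices on the CPU, expand it through the palette into an RGBA buffer each frame, and upload that to an ordinary RGBA texture. A hedged sketch, with illustrative names (the real upload would be a `glTexSubImage2D` call, shown only in the comment):

```c
#include <stddef.h>

/* Expand an 8-bit indexed image through a 256-entry RGBA palette into
   an RGBA8 pixel buffer (dst must hold w*h*4 bytes). After this you
   would push dst to a bound RGBA texture with something like
   glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                   GL_RGBA, GL_UNSIGNED_BYTE, dst); */
static void expand_indexed(const unsigned char *indices, size_t w, size_t h,
                           const unsigned char pal[256][4], unsigned char *dst)
{
    size_t i;
    for (i = 0; i < w * h; ++i) {
        const unsigned char *c = pal[indices[i]];
        dst[i * 4 + 0] = c[0];
        dst[i * 4 + 1] = c[1];
        dst[i * 4 + 2] = c[2];
        dst[i * 4 + 3] = c[3];
    }
}
```

Palette cycling then only touches the small palette array; the per-frame cost is the expansion plus one texture upload, which is cheap for small textures.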

This is most likely beyond OpenGL, and I’m not sure I know what I’m talking about, but would it be possible to interpret the framebuffer with another API that supports palettes? For example, render in OpenGL to an offscreen buffer, then use another API to render that buffer to the screen while performing the palette operations.