In OpenGL there are functions for gamma correction.
However, if you have multiple sources and want to blend them with each other, this isn’t very convenient.
It would be useful to be able to attach a gamma value to any 1D, 2D, 3D, … texture/array.
That would make it possible to write an algorithm that automatically sorts out mixing (addition, multiplication, etc.) of different textures/arrays without extra effort on the user’s part.
(Here, “user” means a programmer writing graphics software.)
Of course this would need some rules, for example:
When adding two textures with different gamma values, the texture whose gamma value differs more from gamma == 1 would be converted to the other texture’s gamma value.
Then both can be used in a mathematical expression without distortions caused by mismatched gamma values.
I don’t know about the other APIs, but the following functionality would also be very handy:
Being able to let the graphics card query the gamma value of the display devices in use and adapt to it automatically.
This of course requires display connections that can report that information.
Please give some feedback on whether this has already been done, and whether it’s a good idea/way of doing things.