Inverted reverse subtractive blending

Greetings all,

As a fairly new user of OpenGL I’ve got a question.

Reverse subtractive blending can be done by:


which would be something like

new = dest-src
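
(A sketch of the blend state meant here, assuming the glBlendEquation entry point from OpenGL 1.4 / ARB_imaging and an active GL context:)

```c
/* reverse subtractive blending: new = dest - src, clamped at 0 */
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_REVERSE_SUBTRACT);
glBlendFunc(GL_ONE, GL_ONE); /* use src and dest unscaled */
```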

But what I would like is the src inverted first:

new = dest-(255-src)
new = dest-255+src

I also tried using glLogicOp() with GL_COLOR_LOGIC_OP enabled, but that seemed to completely override the blending.

Perhaps the best way is to first do a reverse subtractive blend onto a white background and then reverse subtractive blend that result, but I’m unsure how to do that. What do you people think?


Can’t you just do (1.0 - resultColor) at the end of your shader?
(if not using shaders, use a glTexEnv call to invert the resulting color)?

Maybe just setting blend function like this:


Not using shaders indeed, that’s for later. :slight_smile:
What parameters should I give glTexEnv()? It looks like there’s not much choice (GL_MODULATE, GL_DECAL, GL_BLEND or GL_REPLACE). I don’t know if it matters, but I’d like the required openGL version to be as low as possible and I’m at 1.4 now, because I’m using glBlendEquation().

That doesn’t seem to have the wanted effect, I’m afraid.

If not directly, perhaps it’s possible using a color buffer, in the way described earlier. I read a bit about GL_AUX color buffers and such, but I’ve been unable to use them successfully. What I did was:

int buffers = 0;

//.. binding and drawing of texture


but this didn’t do anything noticeable.

On glTexEnv:

Use the “GL_COMBINE” option - which is a kind of early form of “shaders”, available since OpenGL 1.3.

You typically need two texture stages (unless you don’t have textures).

You will have to set up the last texture stage to do a GL_REPLACE with a GL_ONE_MINUS_SRC_COLOR thrown in on the color operand.

Really, this is complex stuff to figure out - but it is possible. (I previously used a macro language that translated to these texture stage states.) Perhaps you can find an old tutorial or example code on these forums or the web? (I really don’t feel like figuring out this math currently.)

Here is some code that will give you a feel for what it should look like (the math is not correct - just cut and pasted from a website):
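
(A sketch only, per the caveat above - the idea is a two-stage setup where the last stage replaces the color using GL_ONE_MINUS_SRC_COLOR on its operand; assumes OpenGL 1.3 multitexture and combine, and that a texture is bound on each stage:)

```c
/* stage 0: ordinary texturing */
glActiveTexture(GL_TEXTURE0);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

/* stage 1: replace the color with the inverse of the previous stage */
glActiveTexture(GL_TEXTURE1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_ONE_MINUS_SRC_COLOR);
```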


Alright, thanks for all the info. I’ll try it out. :slight_smile:

Ok it worked with the following code:
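
(Presumably something like the following - reconstructed to match the glTexEnvi operand lines quoted further down in the thread; a single-stage GL_COMBINE that inverts the texture color, then a reverse subtractive blend:)

```c
/* invert the texture color in the texture environment... */
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_ONE_MINUS_SRC_COLOR);

/* ...then reverse-subtract it from the destination */
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_REVERSE_SUBTRACT);
glBlendFunc(GL_ONE, GL_ONE);
```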

//.. draw
//.. reset texture environment to default

I didn’t use glActiveTexture() though; it didn’t seem to have an effect. I don’t fully grasp that function either.

Thanks again. :slight_smile:

glActiveTexture activates a texture unit. It is used to bind several textures at once.
If you never use this function, textures are always bound to texture unit 0, and the other texture units are not active.

So I can bind multiple textures to a single texture unit and give this texture unit an environment setting, which the textures bound to that texture unit will ‘use’ when drawn? That’s mighty handy, thanks.

So I can bind multiple textures to a single texture unit

No, let me reformulate my answer.
Before the ARB_multitexture extension you were able to bind only one texture per primitive, but now you can bind several textures per primitive by attaching them to different texture units. So there is only zero or one texture bound to each texture unit.
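
For example (the texture names here are made up):

```c
/* bind one texture to each of two texture units */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, baseTex);    /* hypothetical texture id */
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, detailTex);  /* hypothetical texture id */
glActiveTexture(GL_TEXTURE0);             /* leave unit 0 active again */
```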

In your case, if it works without multitexturing, this is not useful.

I think I get it, thanks. :slight_smile:

Okay it’s not entirely working, sadly. The problem is that the ‘inverted’ reverse subtractive blending only works normally ‘once’. These pictures explain it better:
Drawing once:
Drawing multiple times:

Code used to get this result:

glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_ONE_MINUS_SRC_COLOR); // 'inverted' reverse subtractive blending
//glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR); // normal reverse subtractive blending

If I invert the image in MS Paint before running, the results are reversed (left becomes right and right becomes left), which seems logical.
Even after extensive thinking and tinkering, I don’t quite grasp what’s going on here. Got an idea? :slight_smile:

No one? I’m really stuck. :frowning:

I can’t help you; I have never really understood this stuff. If your hardware supports it, use shaders - it would then be very, very easy to do what you want.
And don’t worry, shaders are not as complicated as they seem to be, at least for what you need to do right now. :slight_smile:

Looking at your posted screenshots, I do not understand why the left square in the second image went black, or why you thought this was OK.

(Also, how many times did you render for “multiple times”?)

The only reason I can think of is that you are not using a pure blue image.

eg. for pure blue

Dst = (1,1,1)
Src = (0,0,1)

reverse subtract x1 = (1,1,0)
reverse subtract x2 = (1,1,0)
reverse subtract x3 = (1,1,0)
reverse subtract x4 = (1,1,0)

not using pure blue

Dst = (1,1,1)
Src = (0.1,0.1,1.0)

reverse subtract x1 = (0.9,0.9,0)
reverse subtract x2 = (0.8,0.8,0)
reverse subtract x3 = (0.7,0.7,0)
reverse subtract x4 = (0.6,0.6,0)

reverse subtract x10 = (0,0,0)

However, for inverted reverse subtract an impure blue would do this:

Dst = (1,1,1)
Src = (0.1,0.1,1.0) -> (0.9,0.9,0.0)

inv reverse subtract x1 = (0.1,0.1,1)
inv reverse subtract x2 = (0.0,0.0,1)
inv reverse subtract x3 = (0.0,0.0,1)
inv reverse subtract x4 = (0.0,0.0,1)

This seems to be the case, as in your first image the first inverted reverse subtract still has some red and green in it.

I am curious why you think calling it multiple times should remove the blue (unless you are doing something with alpha?).

Sorry, the actual colour of the subtractive-blended square is RGB(36,65,240); I just picked a blue-ish colour randomly. :slight_smile: This matters a lot indeed, because whether it’s inverted or not, the RGB components are != 0. Thanks for pointing that out.

So I’m using a nonpure blue indeed, and that’s why, like you say, after drawing many times it should become black eventually. This is exactly why I think calling it multiple times should make the image black: it’s still reverse subtractive blending, but with the source inverted. The source is not 1, so the inverted source should not be 0.

But maybe the 240 gets ‘rounded’ somewhere to 255, because when I use an RGB(128,128,128) square, after two inverted subtractive blends the image is black, like it should be! The same happens with normal reverse subtractive blending, which is logical.
So the question remains: why was the blue component 240 seen as 255? Because when I print-screen the nonpure blue square from my OpenGL window, it says it’s RGB(34,68,255). Good call sqrt[-1]! :slight_smile:

Now there are tons of places where this conversion could have happened, as I’m using SDL with SDL_image and convert the images to suit OpenGL. Maybe it’s a 16-bit thing, I’ll check it out. OpenGL says the window is 24 bits with 8 bits for each of the RGBA components (checked using SDL_GL_GetAttribute()) and my desktop is in 32-bit mode as well. I make my texture using

glTexImage2D(GL_TEXTURE_2D, 0, depth/8, width, height, 0, format, GL_UNSIGNED_BYTE, data);

and depth is checked to be 32, so I don’t think it has to do with OpenGL, but with me and SDL, so I’ll continue messing with that.

Thanks a lot for your help, you guys, much appreciated!

I know what this problem is - do not use this code:
glTexImage2D(GL_TEXTURE_2D, 0, depth/8, width, height, 0, format, GL_UNSIGNED_BYTE, data);

The third parameter taking 1, 2, 3, or 4 is more or less deprecated. You see, when you specify 1, 2, 3 or 4 you are saying to OpenGL: “give me a (1/2/3/4)-component texture of any format”.

It is known that ATI always gives you 16-bit textures in that case (you are on ATI, right?) and Nvidia gives 32-bit textures. (Both are OK according to the spec.)

The modern way (since OpenGL 1.1?) is to specify the format you want.

eg. Update your code to be like this:

GLenum parameter = GL_RGBA8;
if (depth == 24)
    parameter = GL_RGB8;

glTexImage2D(GL_TEXTURE_2D, 0, parameter, width, height, 0, format, GL_UNSIGNED_BYTE, data);

Note that you could just use GL_RGBA8 always - the driver will automatically fill out any missing fields (and do data conversions if the data was not in ubyte format).

(Oh, and the reason I know this feature/bug is that a few commercial engines I have worked on had this problem - and it was mistakenly attributed to bad ATI drivers.)

(Oh, and one more thing - the desktop color depth should have no bearing on the texture format you select.)

Ah thanks! Good I posted that texImage call. I’m on an ATi indeed (Radeon 9800XT). It works brilliantly now. :slight_smile:

I posted the desktop depth because I thought the depth of the framebuffer would always be equal to the desktop depth in windowed mode, so if I had a 16-bit desktop, the framebuffer would be 16-bit also, and with print-screen I would get non-32-bit results as well.

Thanks again.