Additive blending not working as expected

I’m trying to set up additive blending using the following calls:

glBlendEquation(GL_FUNC_ADD);
glBlendFuncSeparate(GL_ONE, GL_ONE, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

What I’m trying to achieve in the end is additive blending on the colors and normal alpha blending on the alpha channel. However, instead of the expected result, I get a big white rectangle.
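For reference, here is a minimal CPU-side sketch (my own notation, not actual GL code) of what that blend state computes per fragment:

/* GL_FUNC_ADD with factors (GL_ONE, GL_ONE, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) */
typedef struct { float r, g, b, a; } rgba;

static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

static rgba blend(rgba src, rgba dst)
{
    rgba out;
    out.r = clamp01(1.0f * src.r + 1.0f * dst.r); /* colour: purely additive, alpha never used */
    out.g = clamp01(1.0f * src.g + 1.0f * dst.g);
    out.b = clamp01(1.0f * src.b + 1.0f * dst.b);
    out.a = clamp01(src.a * src.a + (1.0f - src.a) * dst.a); /* alpha: normal "over" blend */
    return out;
}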


The expected result is something like this (using normal alpha blending):
[image: expected result rendered with normal alpha blending]
My texture is completely white (the RGB channels are all 1.0), and the alpha channel is a plot of the values of a 2D Gaussian kernel. It looks like this:
[image: the texture’s Gaussian alpha channel]

I have double-checked with qapitrace that the texture is bound properly, and it works as expected. It also renders as expected with normal alpha blending:

glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

as shown in the image of my expected result above. Is my blending setup wrong?

Well, additive blending means that the result will be at least as bright as either input, the incoming fragment or whatever is already in the framebuffer. So if you start with an all-white texture, you’re going to end up with an all-white screen, regardless of what else you add to it.
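Concretely: your colour source factor is GL_ONE, so the source alpha never touches the colour channels, and for an all-white texture every covered fragment saturates:

dst.rgb = clamp(1 * src.rgb + 1 * dst.rgb) = clamp((1, 1, 1) + dst.rgb) = (1, 1, 1)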

Also: why do you care about the blending for the alpha channel when you aren’t even using the alpha channel?

Do you understand that the alpha channel doesn’t mean anything unless you use a blending mode where you make it mean something? I.e. if you scale either the source or destination colour by the source or destination alpha or its complement.

In this case, I wanted it to mean the amount of color added. Thanks to some external help, I managed to solve it using glBlendFunc(GL_SRC_ALPHA, GL_ONE).
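For completeness, a minimal sketch of the working blend state (the glEnable call is an assumption; blending may already be enabled elsewhere in the original code):

glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
/* colour: dst.rgb = src.a * src.rgb + dst.rgb  -> alpha controls how much white gets added */
/* alpha:  dst.a   = src.a * src.a   + dst.a    -> irrelevant here, since alpha isn't read back */

If separate handling of the alpha channel is still wanted, glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) should give the same colour result while keeping normal alpha blending on the alpha channel.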