Just read Spec 2.1, page 211:
“BlendFunc argument src determines both RGB and alpha source functions, while dst determines both RGB and alpha destination functions.”
Suppose you keep plain glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) (and both the RGB and alpha blend equations are left at their initial value, FUNC_ADD):
R_d is the red component initially in the colorbuffer.
R_s is the red component of the fragment.
R is the red component at the end of the blending in the colorbuffer.
(I skip the green and blue components because their equations are identical.)
I use the same notation for alpha: A_d, A_s and A.
(BTW, this is the notation of table 4.1 on page 210 of Spec 2.1.)
R = R_s*A_s + R_d*(1-A_s)   // OK (in the table, S_r = A_s and D_r = 1-A_s)
A = A_s*A_s + A_d*(1-A_s)   // the first term is A_s^2! (in the table, S_a = A_s and D_a = 1-A_s)
i.e. the wrong thing…
With
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
you get the same equation for rgb, but for alpha:
A = A_s*1 + A_d*(1-A_s)   // in the table, S_a = 1 and D_a = 1-A_s
i.e. no more A_s^2.
R and A are what you end up with in the FBO. Then, I guess "yackies" renders the color buffer into the default framebuffer by drawing a full-screen quad with the FBO's color buffer bound as an RGBA texture. This is where the problem becomes visible if you used glBlendFunc() alone, because the alpha stored in that texture is wrong.