I am implementing deferred shading and trying to blend the results of each light into an FBO. My goal is to end up with an FBO that contains the summed contribution of every light in the scene.
The problem is that as soon as I enable blending, everything seems to fall back to software rendering: I get below 3 FPS.
Is it possible to use GL_BLEND with glBlendFunc(GL_ONE, GL_ONE) when rendering into an FBO?
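For reference, GL_ONE/GL_ONE additive blending computes dst = src + dst per channel (clamped on fixed-point targets, unclamped on float targets). A tiny software model of what the hardware does per fragment, just as a sanity check (the function name is mine, not part of GL):

```c
/* Software model of glBlendFunc(GL_ONE, GL_ONE) on a fixed-point
 * target: out = src*1 + dst*1, clamped to [0, 1].  On a float render
 * target the clamp does not apply and contributions accumulate freely,
 * which is exactly what you want for summing lights.
 *
 * Usage: c = blend_one_one(light_contribution, c);  once per light.  */
static float blend_one_one(float src, float dst) {
    float out = src + dst;
    return out > 1.0f ? 1.0f : out;
}
```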
Sounds like you are using blending with a floating-point render target on hardware that doesn't support it.
nVidia has supported this since the GeForce 6000 series, ATI since the Xxxx series.
So my Quadro FX 4000 can't do it?
Is there any way to detect this programmatically?
nVidia used to append “ForceSW” (or something similar) to the vendor string returned by glGetString() when a software fallback kicked in. I don’t know if they still do this.
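A sketch of that kind of check. Note the “ForceSW” marker is from memory (as said above), not documented behavior, and the helper name is made up, so treat both as assumptions:

```c
#include <string.h>

/* Returns 1 if a GL driver string suggests a software fallback.
 * "ForceSW" is the marker nVidia reportedly used; this is an
 * assumption, not a documented API.  At runtime you would pass
 * glGetString(GL_VENDOR) or glGetString(GL_RENDERER) here.       */
static int looks_like_software_path(const char *gl_string) {
    return gl_string != NULL && strstr(gl_string, "ForceSW") != NULL;
}
```

Checking the strings this way is fragile across driver versions; the more robust approach is to benchmark a few blended frames at startup and bail out if the frame time explodes.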
It seems the Quadro FX 4000 is based on GeForce 6800-class chips (NV40), so your card should at least support blending on FP16 render targets. Check whether you are perhaps using FP32…
Yeah… I was careless and was using FP32. It works now.
Thanks for the tip on detecting floating-point blending!
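For anyone hitting the same problem later: the fix is to request an FP16 internal format when creating the FBO’s color attachment. A minimal sketch, with the extension tokens inlined so it compiles without GL headers (values copied from ARB_texture_float; the helper name is mine):

```c
/* Tokens from the ARB_texture_float extension, inlined so this
 * sketch is self-contained.                                        */
#define GL_RGBA16F_ARB 0x881A  /* blends in hardware on NV40-class chips  */
#define GL_RGBA32F_ARB 0x8814  /* software-blending fallback on NV40      */

/* Internal format to request for a blendable float color attachment,
 * e.g. when allocating the texture you attach to the FBO:
 *
 *   glTexImage2D(GL_TEXTURE_2D, 0, blendable_float_format(),
 *                width, height, 0, GL_RGBA, GL_FLOAT, NULL);
 */
static unsigned int blendable_float_format(void) {
    return GL_RGBA16F_ARB;
}
```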