I need the accuracy of a floating-point FBO when I do ping-ponging in OpenGL – otherwise some strange banding patterns form. It works fine on my Radeon HD 2600 XT and NVIDIA GeForce 8800 GT. The renderer ping-pongs about six times per frame, for SSAO, bloom and depth blur.
Anyhow, I run the exact same thing on a GeForce 7300 GT. I check for extensions like "GL_ARB_texture_float", and the card reports that it supports them. I use only 4 color attachments, which does not exceed GL_MAX_COLOR_ATTACHMENTS_EXT (4 on this card). I also checked GL_MAX_TEXTURE_SIZE, and the buffer dimensions are under 4096. The depth buffer is GL_DEPTH_COMPONENT24, confirmed with glGetIntegerv on GL_DEPTH_BITS.
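For what it's worth, here is a sketch of the extension check I mean. The function name is my own; the one subtlety is that a plain strstr on the GL_EXTENSIONS string can false-positive, because one extension name can be a prefix of another (e.g. "GL_ARB_texture" appears inside "GL_ARB_texture_float"), so the match has to respect token boundaries:

```c
#include <string.h>

/* Word-boundary-safe lookup in a space-separated extension string.
   Returns 1 if `name` appears as a whole token, 0 otherwise. */
int has_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_string) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;   /* keep scanning past this partial match */
    }
    return 0;
}
```

You would call it with the string returned by glGetString(GL_EXTENSIONS).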
I chose an internal format of GL_RGBA16F_ARB – is there a way to check support for this?
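One way I know of to probe a specific internal format is the proxy-texture mechanism: submit the format to GL_PROXY_TEXTURE_2D and read back the level's width, which the driver sets to 0 if it rejects the combination. A sketch, assuming a current GL context (the GL part is guarded here so the pure interpretation step stands alone); note that a proxy check only tells you the format is accepted, not that it stays hardware-accelerated:

```c
/* Interpretation step: a proxy texture reports width 0 on rejection. */
int proxy_width_means_supported(int reported_width)
{
    return reported_width != 0;
}

#ifdef HAVE_GL_CONTEXT   /* assumption: compiled where GL is available */
#include <GL/gl.h>
#include <GL/glext.h>
#include <stddef.h>

/* Ask the driver whether it accepts a w-by-h GL_RGBA16F_ARB texture.
   No texture data is allocated for a proxy target. */
int rgba16f_proxy_ok(int w, int h)
{
    GLint got = 0;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA16F_ARB,
                 w, h, 0, GL_RGBA, GL_HALF_FLOAT_ARB, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got);
    return proxy_width_means_supported(got);
}
#endif
```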
So, it passes all these checks. Yet the moment my shader starts writing into the FBO during the ping-pong (glBegin(GL_POLYGON)), OpenGL Profiler indicates that the program has fallen back to software rendering, and it runs painfully slow – unlike on the other two cards.
I think the Radeon 9800 also claims it can do it.
I suppose the answer is to turn this feature off at runtime when the card can't actually handle it.
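As far as I know there is no portable GL query for "did the driver drop to software?", so one pragmatic option is to time a few throwaway passes at startup and fall back to GL_RGBA8 if they are absurdly slow. A sketch, with the GL part guarded and the threshold an admitted guess (the callback name and budget are mine):

```c
#include <time.h>

/* Decision step: if one warm pass costs more than the budget
   (a few ms is already suspicious for a simple fullscreen quad),
   assume a software path and disable the float FBO. Tune to taste. */
int looks_like_software_path(double ms_per_pass, double budget_ms)
{
    return ms_per_pass > budget_ms;
}

#ifdef HAVE_GL_CONTEXT   /* assumption: compiled where GL is available */
#include <GL/gl.h>

/* Time one ping-pong pass in milliseconds. glFinish before and
   after so we measure the GPU work, not just command submission. */
double time_test_pass(void (*draw_one_pass)(void))
{
    struct timespec t0, t1;
    glFinish();
    clock_gettime(CLOCK_MONOTONIC, &t0);
    draw_one_pass();
    glFinish();
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) * 1e3
         + (t1.tv_nsec - t0.tv_nsec) / 1e6;
}
#endif
```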