I am currently implementing various post-effect shaders in OpenGL (depth of field, bloom effect). So far I have been rendering quads on screen to apply all my post effects, but I would like to use glRect() instead (faster in theory, and it avoids interlation artefacts?).
My problem is that with glBegin(GL_QUADS) I could also send uv-coordinates for each vertex of the quad, but that seems impossible using glRect()!
I could recompute the UV coords from the normalized device coordinates of the vertex in the vertex shader, but this causes the framerate on my (old Radeon 9600!!!) graphics card to drop from 70 to 60 fps.
So, does anybody know a way to send vertex attributes (like uv coords) when calling glRect()?
I highly doubt that using glRect is going to give you any speed increase at all.
We are talking about a single quad here that covers the entire screen, so it does not really matter how you supply the vertex data. (You are going to be fill-rate limited.)
What “interlation artefacts” are you seeing?
I mean “interpolation artefacts” sorry.
I have been implementing the same effects on another system (not OpenGL related), and got weird artefacts when I was rendering two triangles on the screen. Basically you could see a diagonal seam running across the screen. Rendering with a rectangle primitive resolved the problem. Furthermore, viewport clipping was also disabled, and that effectively improved the framerate.
Could glRect() have the same benefits in my application, or am I missing something?
That seam you're describing is caused by the combination of two things: nearest filtering and a non-rectangular quad.
If you can manage to get the quad perfectly rectangular, by using, let's say, glOrtho, that diagonal seam will no longer be there.
And if you are using linear filtering, it will never appear at all.
And besides, glRect is the same thing as drawing the quad with glBegin/glEnd; it's just another way of doing it.
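To make the equivalence concrete, here is a sketch (assuming a current GL context and the post-effect texture already bound; not a complete program, since it cannot run without a window and context). glRectf() expands to the same four-vertex quad, but only glBegin/glEnd lets you attach a texture coordinate to each corner:

```c
#include <GL/gl.h>

/* glRectf() draws the fullscreen rectangle in one call, but offers
 * no way to vary attributes such as UVs per vertex: */
void fullscreen_with_glrect(void)
{
    glRectf(-1.0f, -1.0f, 1.0f, 1.0f);
}

/* The glBegin/glEnd form draws the same rectangle, and each
 * glTexCoord2f sets the UV carried by the following glVertex2f: */
void fullscreen_with_quad(void)
{
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
}
```

So if you need per-vertex UVs, the glBegin/glEnd version is the way to go; for a single fullscreen quad the difference in submission cost is negligible either way.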