Capturing the frame buffer without rendering the image in the output window

I have a problem capturing the frame buffer into an array without the output actually being visible in the output window. I can capture the image from the frame buffer, but I don't want the image to be displayed, because the image gets covered by other windows on the computer terminal. I am capturing the image from the frame buffer, converting it to a JPEG file, and displaying that JPEG file. But whenever an image is rendered on screen, any other windows overlapping it are captured into my array as well, so my JPEG images contain unwanted parts too.
I hope someone can help me with this.
DoloMighty thanks in advance…


Oops… this was for dmy…
Sorry, I am Paolo, but I think I can help anyway.
So actually you are capturing the screen, kind of. I mean, you are physically capturing what the user sees in front of the screen, right?
What you want to do, as I understand it, is capture the whole client area of the window in which rendering occurs.
You should create a double-buffered window, render the scene, and capture the back buffer.
For this, set the current buffer used for reading with glReadBuffer(GL_BACK), then render the scene, wait for completion by calling glFinish, and read the contents with glReadPixels. Remember to set the pixel transfer parameters correctly, because pixel data also passes through the pixel transfer pipeline when reading.
Note: if you don’t swap the buffers, you will not see the rendered image on the window.
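The steps above might look roughly like this in C. This is only a sketch: it assumes a double-buffered RGB context is already current (created with GLUT, GLX, WGL, or similar), and capture_back_buffer is a hypothetical helper name, not part of any API.

```c
/* Sketch: read the rendered image back from the back buffer of a
 * double-buffered context, without ever swapping it to the screen.
 * Assumes a valid OpenGL context is current. */
#include <GL/gl.h>
#include <stdlib.h>

unsigned char *capture_back_buffer(int width, int height)
{
    unsigned char *pixels = malloc((size_t)width * height * 3);
    if (!pixels)
        return NULL;

    glReadBuffer(GL_BACK);   /* read from the back buffer */

    /* ... render the scene here ... */

    glFinish();              /* wait until rendering has completed */

    /* Request tightly packed rows; the default GL_PACK_ALIGNMENT of 4
     * inserts row padding whenever width * 3 is not a multiple of 4. */
    glPixelStorei(GL_PACK_ALIGNMENT, 1);

    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    /* Do NOT call SwapBuffers/glutSwapBuffers: the image stays in the
     * back buffer and is never shown in the window. */
    return pixels;
}
```

One thing to keep in mind when writing the data out as a JPEG: glReadPixels returns rows bottom-to-top, so you will likely need to flip the image vertically before encoding it.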

[This message has been edited by paolom (edited 03-17-2000).]