What I mean is: let's say I have a simple cube rotating in an OpenGL perspective view with a black background, and I want to take a picture of this cube every 10 seconds and save it as a JPG, so that I get the raw pixel data of the cube as it looks on the screen, without the black background (though it would actually also be OK if the background stayed black).
I hope I was able to express myself clearly enough. Feel free to tell me a shorter way to ask this :).
Is it possible to get the screen data in the way I described?
Is there like some buffer that holds it every frame and I can somehow get it from there?
Here is a step-by-step overview; you can search the web or ask again in this thread if you need details on any step:
make sure you ask for an RGBA framebuffer, as you will need the alpha channel for transparency (for the "like on the screen without the black background" part)
glClear with the default clear color, which is black and transparent, just like you need
render your scene
before swapping buffers, use glReadPixels to copy the rendered image from the framebuffer into an array in your program
now you can swap buffers if you also want to see the render on screen, but that is not needed for the image file
from the raw pixel data, use any image load/save library (like FreeImage) to store it in a file. But watch out! JPEG only stores RGB values, so use another format instead, like .png (best), .tif, or .tga: each can store full RGBA with transparency.
if you use antialiasing and/or blending when rendering, for best transparency results you will have to treat the stored image as "premultiplied alpha", but that is probably not important yet.
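To make the glReadPixels and premultiplied-alpha steps concrete, here is a minimal sketch in C. Note two details the steps above imply: glReadPixels returns rows bottom-up, so most image libraries need the buffer flipped vertically, and premultiplying means scaling each RGB value by its alpha. The actual GL call is shown as a comment (it needs a live context); the helper names `flip_rows_in_place` and `premultiply_alpha` are my own, not from any library:

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* In your program, right before swapping buffers, you would do something like:
 *   uint8_t *pixels = malloc(width * height * 4);
 *   glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
 * and then run the two helpers below before handing the buffer to FreeImage. */

/* glReadPixels fills the buffer bottom row first; flip it top-to-bottom. */
void flip_rows_in_place(uint8_t *pixels, int width, int height)
{
    int stride = width * 4;                 /* 4 bytes per RGBA pixel */
    uint8_t *tmp = malloc(stride);
    for (int y = 0; y < height / 2; ++y) {
        uint8_t *top = pixels + y * stride;
        uint8_t *bot = pixels + (height - 1 - y) * stride;
        memcpy(tmp, top, stride);
        memcpy(top, bot, stride);
        memcpy(bot, tmp, stride);
    }
    free(tmp);
}

/* Scale each RGB component by its alpha (straight -> premultiplied alpha). */
void premultiply_alpha(uint8_t *pixels, int width, int height)
{
    for (int i = 0; i < width * height; ++i) {
        uint8_t a = pixels[i * 4 + 3];
        pixels[i * 4 + 0] = (uint8_t)(pixels[i * 4 + 0] * a / 255);
        pixels[i * 4 + 1] = (uint8_t)(pixels[i * 4 + 1] * a / 255);
        pixels[i * 4 + 2] = (uint8_t)(pixels[i * 4 + 2] * a / 255);
    }
}
```

After these two calls the buffer is in the usual top-down, premultiplied layout, ready to be saved as .png with whatever image library you pick.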
Thank you! I shall try it out. Btw, cool name, ZBuffeR.