Can I draw a GL scene directly in memory?

Hi, everyone.

Recently I ran into a problem. I want to render an OpenGL scene (convert it to raster form) and write it into a file.

I sometimes work on the Windows platform and sometimes on the UNIX platform. In either case I can easily write some “copy screen” code, but then I would have to do this twice for my applications, so I hope I can find a platform-independent way to rasterize the GL scene directly in memory, i.e., I want to avoid platform-dependent handles.

Can anyone give me some ideas? Thanks.

[This message has been edited by kull (edited 12-22-2000).]

You can use a set of functions that read and write pixels from and to one of the buffers associated with a rendering context, depending on the current raster position. After selecting the buffer you want with glReadBuffer and glDrawBuffer, you can use glReadPixels, glDrawPixels, or glCopyPixels. You control the packing and unpacking of the data with glPixelStore, and you change the raster position with glRasterPos.

First, thanks for your help.

I still have some questions.

We know that when we use the OpenGL libraries, we must first create a window and then draw into it, whether under Windows or under X. So the traditional demos always work (they save the rendered image by reading back the pixel data). But this time I hope my program can avoid creating a window and do everything directly in memory, then output the image data. Can I do that? Or does OpenGL really need some extension for this? Or must I hack the MesaGL source code to achieve my crazy aim?

In OpenGL, creating the window is left to the windowing system installed on the computer, and you can’t create one in a platform-independent way. However, retrieving pixels and data from the frame buffer (pixels, colors, etc.) can be done that way (read my previous post). If you want to do the whole thing in a platform-independent way, I think you don’t have any choice other than programming with Java and the Java 3D API.

You can use GLUT for creating windows in a way that is not platform specific. If you want to do your thing in memory without actually opening a window, try initializing GLUT but never calling the glutMainLoop() function. Not sure if this will work for what you want, but it might be worth a try. I did something like this to print out a list of extensions under Linux.

Right, Deiussum! I hadn’t worked with GLUT before, but after experimenting with it today, I found that it’s really a great solution if one doesn’t want to do some Java. However, it has one drawback: you can’t mix different controls in the same window, i.e., you can’t place a rendering context next to a button or check box, etc. So you are forced to write an MDI application. Am I right?

Yeah, if you want to add controls to the window, you won’t be able to use GLUT. You don’t need to make it an MDI, however. I don’t think kull was looking to use controls, though. It sounded to me like he just wanted a way to dump some OpenGL output to a file without ever showing the window.

[This message has been edited by Deiussum (edited 12-27-2000).]

Why don’t you draw your GL output into memory directly? I don’t know how to do it under UNIX, but I know it’s easy to render into a memory-based device context on Windows; you need not create a window. I have a CMemoryGL class; if someone wants it, mail me:

Oh, I must apologize: English is not my first language, so I couldn’t express my idea clearly. Initially, I got this challenge when I wanted to build a PHP module that could export my original X/Windows OpenGL programs’ pictures through a web server. But in that case I have no window handle. And truly I have many OpenGL things which must be done on both platforms, and porting always takes me a lot of time. So can OpenGL do this low-level, platform-independent thing? The direct way is to do everything in memory, I think; that would be a good approach for me. But perhaps this is the wrong way for OpenGL. Maybe I really should do some hacking deep into the MesaGL code.

[This message has been edited by kull (edited 12-29-2000).]

I tried to post this message once before; apparently it didn’t go through…

So… I don’t have an answer to Kull’s question, but I do think I understand what he’s saying.

He wants to render his scene into an offscreen buffer and save it out as a picture. (Am I right on this?) I believe you can do this with OpenGL, I just don’t know how.
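For the “save it out as a picture” half, here is a small sketch that dumps an in-memory RGB buffer (e.g. one filled by glReadPixels) as a binary PPM file, a format readable on both Windows and UNIX. The helper name write_ppm is just an illustrative assumption:

```c
/* Sketch: write a tightly packed RGB buffer as a binary PPM (P6)
 * file. Assumes the bottom row comes first in memory, as glReadPixels
 * returns it, so rows are flipped on the way out. */
#include <stdio.h>

int write_ppm(const char *path, const unsigned char *rgb,
              int width, int height)
{
    FILE *f = fopen(path, "wb");
    int y;
    if (!f)
        return -1;

    /* P6 header: magic, dimensions, max color value */
    fprintf(f, "P6\n%d %d\n255\n", width, height);

    /* OpenGL's origin is bottom-left; PPM wants the top row first */
    for (y = height - 1; y >= 0; y--)
        fwrite(rgb + (size_t)y * width * 3, 3, (size_t)width, f);

    fclose(f);
    return 0;
}
```

This keeps the file-writing side completely platform-independent; only obtaining the pixels differs per platform.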


Originally posted by BwB:
I tried to post this message once before; apparently it didn’t go through…


I think it ended up in another thread by mistake. I was reading a thread and saw your response to Kull right in the middle of it and wondered what it had to do with the thread I was reading. =)

I answered a similar question a while back. See this topic: