OpenGL loses its context when called from a DLL?

Hi folks !

I’m developing (with Visual Studio C++ 2005) a screen saver (yet another one, sigh…) that is built on plugins:
The main part (.exe) manages all the windows/display/events stuff.
The plugins (.dll) produce the bitmap to be displayed (I already made some nice ones).
A plugin exposes entry points such as “init” (called once, at the beginning) and “next” (called iteratively until the mouse is moved).
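For context, the host side of such a plugin interface typically looks something like the sketch below. Only the entry-point names `init` and `next` come from the post; the signatures, the DLL name, and the bitmap layout are assumptions for illustration:

```c
#include <windows.h>

/* Hypothetical plugin entry-point signatures: only the names
   "init" and "next" come from the post, the rest is assumed. */
typedef int  (*init_fn)(int sizex, int sizey);
typedef void (*next_fn)(unsigned char *bitmap);

int main(void)
{
    HMODULE plugin = LoadLibraryA("myplugin.dll");
    if (!plugin) return 1;

    init_fn init = (init_fn)GetProcAddress(plugin, "init");
    next_fn next = (next_fn)GetProcAddress(plugin, "next");
    if (!init || !next) { FreeLibrary(plugin); return 1; }

    unsigned char bitmap[640 * 480 * 3];   /* assumed RGB layout */
    init(640, 480);
    next(bitmap);   /* in the real framework, called repeatedly
                       until the mouse is moved */

    FreeLibrary(plugin);
    return 0;
}
```

Note that after `LoadLibraryA`, the plugin's code and its own dependent DLLs (such as OpenGL32.dll) all live in the host process's address space, which is relevant to the question below.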

Now I’m trying to create a new plugin (i.e. DLL) that uses OpenGL to compute 3D images.
This plugin does not manage its own display: I don’t use glutMainLoop, I just read back the bitmap and display it myself.
In the “init” function I initialize all the OpenGL state:

```c
glutInit( &argc, argv );
glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH );
glutInitWindowSize( sizex, sizey );
glutCreateWindow( "dummy" );
glClearColor( 0.0f, 0.0f, 0.0f, 1.0f );
glShadeModel( GL_SMOOTH );
glFrontFace( GL_CW );
glCullFace( GL_BACK );
glEnable( GL_CULL_FACE );
/* ... */
```

In the “next” function I render the scene and read back the bitmap:

```c
glLoadIdentity();
glBegin( GL_QUADS );
glVertex3d( /* ... */ );
/* ... */
glEnd();
glFlush();
glutSwapBuffers();
glutPostRedisplay();
glReadPixels( 0, 0, sizex, sizey, GL_RGB, GL_UNSIGNED_BYTE, bitmap );
```

So… here is the issue :
In the “next” function, nothing works… the whole OpenGL context (i.e. state) seems to be lost.
If I hack my code and call “next” directly from “init”, it works fine.
All my code is single-threaded.
It is as if returning from a DLL function made OpenGL lose its context.
I don’t know what happens when one DLL (my plugin) calls another DLL (OpenGL32.dll)…
Do you have any idea how OpenGL manages its own context?
I thought that since my DLL is loaded in memory, the DLLs it depends on are also kept in memory with their own context… but I may be wrong…

Thanks for any advice.


Lemme get this straight…

You have a plugin interface through which add-ons receive a message from your framework to render a frame into a bitmap, which your framework later presents?

P.S. This should just work, unless your plugin is doing something silly and botching GL state somehow.

Does your main app use an OpenGL context? When a DLL is loaded, it becomes a “part” of the application, sharing its memory and data. Therefore, a DLL cannot have its “own” rendering context; it creates one relative to the application. Just visualise a DLL as another set of compiled source files, linked at runtime instead of at compile time.

Thanks guys !

You’re both right!
It works as it should!
But it didn’t… I forgot that I had added some multithreading to run some benchmarks…
I deleted all that code, went back to a single-threaded setup, and everything works fine!
I just wonder why… threads share their memory and (as the code was designed) there was no risk of simultaneous access to memory… as proof, my other plugins worked very well in that multithreaded setup.

By the way, since I don’t use OpenGL’s window management, do you know if it is possible to avoid using GLUT?

Thanks a lot


There is no such thing as “OpenGL’s window management”. GLUT is just a cross-platform library that hides the OS-specific windowing code. On Windows, use the native Win32 API.
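To make the suggestion concrete, here is a minimal sketch of creating a GL context through the Win32 API alone, with no GLUT. It assumes you already have an `HWND` (e.g. a hidden window created with `CreateWindow`); error handling is omitted for brevity:

```c
#include <windows.h>
#include <GL/gl.h>

/* Create an OpenGL rendering context for an existing window
   using only Win32/wgl calls (no GLUT). Sketch only: every
   return value should be checked in real code. */
HGLRC create_gl_context(HWND hwnd)
{
    PIXELFORMATDESCRIPTOR pfd = { 0 };
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    HDC hdc    = GetDC(hwnd);
    int format = ChoosePixelFormat(hdc, &pfd);
    SetPixelFormat(hdc, format, &pfd);

    HGLRC ctx = wglCreateContext(hdc);
    wglMakeCurrent(hdc, ctx);   /* bind the context to this thread */
    return ctx;
}
```

Since the screen saver only reads the frame back with `glReadPixels`, the window can stay hidden; `SwapBuffers(hdc)` replaces `glutSwapBuffers()` if you keep double buffering.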

Because an OpenGL context can be current on only one thread at a time. GLUT initialization creates a GL context and binds it to the calling thread. If you then call GL entry points from another thread, they won’t use that context.
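If you do want to issue GL calls from a worker thread, the context has to be handed over explicitly. A sketch of the two sides, assuming the `HDC`/`HGLRC` pair is shared between the threads by some means of your own:

```c
#include <windows.h>
#include <GL/gl.h>

/* A GL context is current on at most one thread at a time.
   To use it from a worker thread: release it on the creating
   thread first, then bind it on the worker. Sketch only. */

/* On the thread that created the context: */
void release_context(void)
{
    wglMakeCurrent(NULL, NULL);   /* unbind from this thread */
}

/* On the worker thread, before any gl* calls: */
void bind_context(HDC hdc, HGLRC ctx)
{
    wglMakeCurrent(hdc, ctx);     /* bind to this thread */
}
```

Without this hand-over, GL calls made on the worker thread have no current context and are silently ignored, which matches the “context seems to be lost” symptom described above.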

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.