Hi folks!
I’m developing (with Visual Studio C++ 2005) a screen saver (yet another one, sigh…) that is built on plugins:
The main part (.exe) manages all the windows/display/events stuff.
The plugins (.dll) produce the bitmap to be displayed (I already made some nice ones).
A plugin has entry points such as “init” (called once, at startup) and “next” (called repeatedly until the mouse is moved).
Now I’m trying to create a new plugin (i.e. DLL) that uses OpenGL to compute 3D images.
This plugin does not manage its own display: I don’t use glutMainLoop, I just grab the resulting bitmap and display it myself.
In the “init” function I set up all the OpenGL state:
. glutInit( &argc,argv ) ;
. glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH ) ;
. glutInitWindowSize( sizex,sizey ) ;
. glutCreateWindow( "dummy" ) ;
. glClearColor( 0.0f,0.0f,0.0f,1.0f ) ;
. glShadeModel( GL_SMOOTH ) ;
. glFrontFace( GL_CW ) ;
. glCullFace( GL_BACK ) ;
. glEnable( GL_CULL_FACE ) ;
. blah blah…
In the “next” function I render the scene and grab the bitmap:
. glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT ) ;
. glLoadIdentity() ;
. glBegin( GL_QUADS ) ;
. glVertex3d …
. blah blah…
. glEnd() ;
. glFlush() ;
. glutSwapBuffers() ;
. glutPostRedisplay() ;
. glReadPixels( 0,0,sizex,sizey,GL_RGB,GL_UNSIGNED_BYTE,bitmap ) ;
So… here is the issue:
In the “next” function, nothing works… the whole OpenGL context (i.e. state) seems to be lost.
If I hack my code and call “next” directly from “init”, everything works fine.
All my code is single-threaded.
It is as if returning from a DLL function made OpenGL lose its context.
I don’t know what happens when one DLL (my plugin) calls into another DLL (OpenGL32.dll)…
Do you have any idea how OpenGL manages its context?
I thought that since my DLL stays loaded in memory, the DLLs it uses would also stay in memory with their state intact… but I may be wrong…
Thanks for any advice.