AGL Shared Context

I have the following code to create a shared rendering context (so I can use it from another thread). It works perfectly on Linux, and I also have the M$ version of it up and running:

#ifdef __linux__

	static Display *dpy = NULL;
	static GLXDrawable drawable = 0;

	GLXContext GetCurrentContext( GLvoid )
	{ return glXGetCurrentContext(); }

	GLXContext CreateContext( GLXContext _ctx )
	{
		XVisualInfo *vi;
		XVisualInfo tmp;
		GLXContext ctx;
		int nvis;
		dpy = glXGetCurrentDisplay();
		drawable = glXGetCurrentDrawable();
		/* match a visual on the current screen; passing _ctx makes the new
		   context share display lists, textures, VBOs, etc. with it, and
		   GL_TRUE requests direct rendering */
		tmp.screen = DefaultScreen( dpy );
		vi = XGetVisualInfo( dpy, VisualScreenMask, &tmp, &nvis );
		ctx = glXCreateContext( dpy, vi, _ctx, GL_TRUE );
		XFree( vi );	/* XGetVisualInfo allocates; free it */
		return ctx;
	}

	GLvoid SetCurrentContext( GLXContext _ctx )
	{ glXMakeCurrent( dpy, drawable, _ctx ); }

	GLvoid DestroyContext( GLXContext _ctx )
	{ glXDestroyContext( dpy, _ctx ); }

#endif

Now I'm trying to port this to the Mac. I should mention that I'm using GLUT on the Mac (and freeglut on Linux/M$) for portability. So far I've managed to produce this code:

#ifdef __APPLE__

	AGLContext GetCurrentContext( GLvoid )
	{ return aglGetCurrentContext(); }

	AGLContext CreateContext( AGLContext _ctx )
	{
		/* minimal attribute list; AGL lists are terminated by AGL_NONE */
		GLint attribs[] = { AGL_RGBA, AGL_DOUBLEBUFFER, AGL_NONE };
		AGLPixelFormat pxf = aglChoosePixelFormat( NULL, 0, attribs );
		return aglCreateContext( pxf, _ctx );
	}

	GLboolean SetCurrentContext( AGLContext _ctx )
	{ return aglSetCurrentContext( _ctx ); }

	GLvoid DestroyContext( AGLContext _ctx )
	{ aglDestroyContext( _ctx ); }

#endif

Now… with the code above I DO manage to create an AGLContext, but because of the pixel format my context doesn't share display lists, textures, VBOs, etc. (at least I guess that's why).

My question is:

#1. Is it possible to create a shared AGLContext even though I'm using GLUT (since I got it working on the other platforms)?

#2. If #1 == “Yes”: how can I get the pixel format of another AGLContext, given that AGL doesn't provide any function to query a context's pixel format?

Thanks in advance,


OK, forget my last post: GLUT uses CGL to create the window, which is why my AGLContext didn't work as a shared context… so now my question is the same as the one this guy asked in the link below: “How to retrieve the pixel format in CGL.”

Any ideas? Nobody replied to that post… and I'm scared :wink:

OK, now it's getting ridiculous :wink:

Once again, answering my own question:

#ifdef __APPLE__

	CGLContextObj GetCurrentContext( GLvoid )
	{ return CGLGetCurrentContext(); }

	CGLContextObj CreateContext( CGLContextObj _ctx )
	{
		CGLContextObj tmp;
		CGLPixelFormatObj pxf;
		GLint n_pxf;
		/* empty attribute list, terminated with 0 */
		CGLPixelFormatAttribute attribs[] = { (CGLPixelFormatAttribute)0 };
		CGLChoosePixelFormat( attribs, &pxf, &n_pxf );
		CGLCreateContext( pxf, _ctx, &tmp );
		CGLDestroyPixelFormat( pxf );	/* the context keeps its own reference */
		return tmp;
	}

	GLboolean SetCurrentContext( CGLContextObj _ctx )
	{ return CGLSetCurrentContext( _ctx ); }

	GLvoid DestroyContext( CGLContextObj _ctx )
	{ CGLDestroyContext( _ctx ); }

#endif

On the Mac it seems that the pixel format doesn't have to match for two CGL contexts to share resources, so the code above works just fine. Adding kCGLPFAAccelerated to the attribs MAY ensure the context is hardware-accelerated, but I didn't see any performance problem without it. Anyway, the code above works just fine with pthreads :wink:

On Leopard you can call CGLGetPixelFormat(CGLGetCurrentContext()) to retrieve your GLUT window's pixel format. On earlier OS versions you cannot retrieve a context's pixel format, so you must fall back on manually constructing a pixel format that can be shared with GLUT's. Looking at the GLUT source may be instructive in making this work (and make sure you test on systems with video cards from multiple vendors).
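Concretely, on 10.5+ the CreateContext body can collapse to reusing GLUT's own pixel format (a fragment, not a full program; note that CGLGetPixelFormat returns the format without transferring ownership, so it must not be released):

```c
/* 10.5+ only: reuse the current (GLUT) context's pixel format so the
   new context is guaranteed to be compatible with it. */
CGLContextObj cur = CGLGetCurrentContext();
CGLPixelFormatObj pxf = CGLGetPixelFormat( cur );  /* not retained; do not release */
CGLContextObj shared = NULL;
CGLCreateContext( pxf, cur, &shared );             /* shares with GLUT's context */
```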

GLUT is not intended for this kind of thing!

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.