wglMakeCurrent leaking?


I hope someone can help with this…

I’ve developed an MFC app that renders to 4 different GL windows. (I think) I’ve done the usual thing in that I get the current rendering and device context, then call make current for the object I’m rendering, render the object, then restore the original rendering and device context.

My problem is a memory leak unless I comment the code to restore the context. My question is what could be the cause of this, do I need to look at the graphics driver (Mobility Radeon 9600)?

Any help appreciated.



void CTraceGraph::OnPaint()
{
	CPaintDC dc(this);	// device context for painting

	// Store the current rendering and device contexts
	HDC hDC = wglGetCurrentDC();
	HGLRC hRC = wglGetCurrentContext();

	wglMakeCurrent(m_hDC, m_hRC);	// Make this view's rendering context current

	DrawObject();	// Now render the targets

	if (hDC != NULL && hRC != NULL)
		wglMakeCurrent(hDC, hRC);	// Restore the previous rendering and device contexts
}

Try getting rid of the CPaintDC dc; it could be the mix of GDI and GL that’s freaking things out. I’ve seen it before.

If that doesn’t work, try an override of the PreCreateWindow method to make sure each window has the CS_OWNDC class style set. I’ve seen this fix some problems as well.

BOOL MyClass::PreCreateWindow(CREATESTRUCT& cs)
{
	// Register a window class with its own private DC (CS_OWNDC)
	cs.lpszClass = AfxRegisterWndClass(CS_OWNDC | CS_HREDRAW | CS_VREDRAW);
	return MyBaseClass::PreCreateWindow(cs);
}

If you get rid of the CPaintDC, don’t forget to call ValidateRect(NULL) to validate your window. (CPaintDC would otherwise do this for you automatically.)
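To illustrate, here’s a minimal sketch of what the paint handler might look like without the CPaintDC. The member names m_hDC/m_hRC and DrawObject() are taken from the code earlier in the thread; the rest is an assumption about the poster’s class, not their actual code:

```cpp
void CTraceGraph::OnPaint()
{
	// No CPaintDC here, so BeginPaint/EndPaint are never called on the window.
	wglMakeCurrent(m_hDC, m_hRC);
	DrawObject();
	SwapBuffers(m_hDC);

	// CPaintDC normally validates the update region for you. Without it,
	// we must validate manually or Windows keeps sending WM_PAINT forever.
	ValidateRect(NULL);
}
```

The ValidateRect(NULL) call is the important part: skipping it turns every redraw into an endless stream of WM_PAINT messages.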


Thanks for that.

Removing the CPaintDC did not work; it still leaked. The PreCreateWindow thing just introduced flicker…

Hmm, the flicker must be because double buffering wasn’t working, meaning I need to make sure that SwapBuffers() is getting the right device context for the swap…

I need to look at this a little more closely.

Thanks for your help.


One minor thing to try: It’s good practice to restore the current DC/RC before leaving the paint function, but what happens if you just make the cached DC/RC current before drawing and ensure that any other GL drawing does the same?

[Your app might not interop with other OpenGL apps that don’t make current every frame, but it would help isolate the leak, IMO.]

This all gets painfully worse with .NET, I’m afraid. I just went through a similar run, but it was the DC/RC objects themselves that were piling up.

Why not use only one RC and bind it to whichever DC you need at the time? As long as you set up all the DCs with the same pixel format, wglMakeCurrent will work just as well.
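A minimal sketch of that single-RC approach, for clarity. The names g_sharedRC, SetupPixelFormat, AttachWindow, and PaintWindow are hypothetical helpers, not anything from the poster’s code; the key constraint is that every window’s DC must be given an identical pixel format before the shared context is bound to it:

```cpp
#include <windows.h>
#include <gl/gl.h>

static HGLRC g_sharedRC = NULL;

// Hypothetical helper: apply one common pixel format to a window's DC.
static void SetupPixelFormat(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 16;
    SetPixelFormat(hDC, ChoosePixelFormat(hDC, &pfd), &pfd);
}

// Call once per window; the first call creates the one shared context.
void AttachWindow(HDC hDC)
{
    SetupPixelFormat(hDC);
    if (g_sharedRC == NULL)
        g_sharedRC = wglCreateContext(hDC);
}

// Per-window paint: bind the shared RC to this window's DC, draw, swap.
void PaintWindow(HDC hDC)
{
    wglMakeCurrent(hDC, g_sharedRC);
    // ... GL drawing for this window ...
    SwapBuffers(hDC);
}
```

Note the trade-off: with one RC, all windows share GL state (textures, display lists), so per-window state must be reset on each bind.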


Guys, thanks for the response.

Removing the code to restore the previous RC/DC is fine so long as I am updating only one window; the leak stops. But as soon as I mouse over one of the other windows (which causes invalidation of that window), the leaks start again.

This is making me think that yooyo’s idea may be the only solution. I’d thought about doing that but was reluctant, as I’d need to restructure things a little.

What I don’t understand is that the get RC/DC, make current, render, restore RC/DC pattern is pretty fundamental stuff. I’ve not done anything out of the ordinary… could it be an XP or graphics driver thing, or maybe a problem specific to my platform?

Whatever, I’ll go with the single RC thing to see what results I get.

Thanks again,



Just so you know, I’ve implemented multiple viewports using MFC on XP, each one with its own DC/RC, and never had any trouble. As you said, it could simply be the card, drivers, or your laptop. It could even be something else in your code that we haven’t seen yet, but surely not. :slight_smile:

While the suggestions given are good ones, they should not be necessary for normal operation, as you quite rightly pointed out. I smell a driver issue, but I can’t be sure.

OK, now I’m really confused. I’ve spent the morning adapting the GL window classes so that they will adopt a GL RC if provided, or create their own if not. But the leak will not go away.

At least the code is a little nicer now that I’ve given it some attention, and I’m fairly satisfied that what I thought I knew about GL was correct.

I’m going to try a different platform later, but I’ll have to do some real work for a while first…


What kind of leak are you talking about? We can see in our apps that there are sometimes “leaks” when you upload new textures that get managed by the OpenGL driver, but this data is reused or restored by the system later on.

This is, however, reported by BoundsChecker and Purify as a leak at wglMakeCurrent…

See if it is related to texture size…

Originally posted by Cyranose:
[Your app might not interop with other openGL apps that don’t makeCurrent every frame, but it would help isolate the leak, IMO.]
Nope, that’s safe. The current GL context is per-thread state, so naturally everything happening in another process is isolated as well.

… if the drivers work as they should :wink:


I have the same problem with my Radeon 9600 on WinXP with the latest drivers (Catalyst 4.8); a GF 5700 doesn’t leak. The following minimal code leaks around 500k per second:

#include <windows.h>
#include <gl/gl.h>

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int iCmdShow)
{
  HWND hWnd = CreateWindow( "STATIC", "wglMakeCurrent leak on ATI ?",
		WS_OVERLAPPEDWINDOW, 0, 0, 256, 256,
		NULL, NULL, hInstance, NULL );

  PIXELFORMATDESCRIPTOR pfd;
  ZeroMemory( &pfd, sizeof( pfd ) );
  pfd.nSize = sizeof( pfd );
  pfd.nVersion = 1;
  pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;
  pfd.iPixelType = PFD_TYPE_RGBA;
  pfd.cColorBits = 24;
  pfd.cDepthBits = 16;
  pfd.iLayerType = PFD_MAIN_PLANE;

  HDC hDC = GetDC( hWnd );
  SetPixelFormat( hDC, ChoosePixelFormat( hDC, &pfd ), &pfd );
  HGLRC hRC = wglCreateContext( hDC );

  for (unsigned i = 0; i < 500000; ++i)
  {
    wglMakeCurrent( hDC, hRC );
    wglMakeCurrent( NULL, NULL );
  }

  wglDeleteContext( hRC );
  ReleaseDC( hWnd, hDC );
  DestroyWindow( hWnd );
  return 0;
}
Anyone have a clue? We really need to use wglMakeCurrent all the time.


I reduced the amount of the apparent leak by using only a single HGLRC for the whole application. This makes the code a bit less portable as the window classes are no longer standalone but it works.

Also, something I’ve noticed: although memory appears to leak each time a window is redrawn when you watch the process with Task Manager (which is not always a good indicator of the true working set), if you minimise the application and then maximise it again, the memory usage drops right down and builds back up as the video RAM is de-allocated and reallocated. In my simple mind, that is what I believe is happening.

It’s an annoying problem, as I have not seen this on other development platforms, and I have not yet been able to rule out XP by building on the W2k system that I previously developed on.

I have other problems too that I think may be XP related. I think the solution is going to be to build on a W2k platform for deployment.

I hope that this has helped a little. Overall, my fix is going to be to avoid ATI at all times in the future.


It’s a memory leak, and plenty of people have had this sort of problem. This one talks about Win2k and a GeForce card.


Though, to make sure, create a normal window instead of “STATIC”: register your own window class, and be sure to use the CS_OWNDC flag.

Then send to devrel@ati.com

I wonder if the resize leak has anything to do with the fact that one of my apps likes to frequently crash on resize. (Dual monitor, easily overflowing onboard memory.)

mcsellski, I don’t think I can keep the same HGLRC, as I need to render to 2 different windows and I am using pbuffers.
Anyway, I don’t understand how building on W2k would change anything. Or do you mean running the app under W2k?

V-man, I already made the test with a “normal” window; I just made the repro as short as possible. It has been sent to devrel@ati.com.

I will send another reply if I get feedback.