memory leak from closing pixmap windows


Hope one of the OpenGL gurus can give us a hand.
I've got sample code below; it results in continuously growing memory usage, and we have no idea why.
Under OS X the sample program itself accumulates the memory, but under Linux the memory is added to the X server, which eventually crashes and restarts.
We ran this code and used 'top' in another window to watch X's memory usage. The memory usage for X went up about 10 MB for every 30000 loops of the code.
We'd like the program to run without a continuous increase in memory, so if anyone could show us how to do that, we would be grateful. The only clue is that the sample code doesn't exhibit this behavior when the call to glXChooseVisual is commented out.

Thank you.


#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main( int argc, char** argv ) {
    Display *display;
    int i;
    XVisualInfo *vi;
    int configuration[] = { GLX_DOUBLEBUFFER, GLX_RGBA, GLX_DEPTH_SIZE, 16, None };

    for ( i = 0 ; ; i++ ) {
        /* Open X Display */
        display = XOpenDisplay(NULL);
        if (display == NULL) {
            fprintf(stderr, "Cannot open display.\n");
            exit(1);
        }

        if (!glXQueryExtension(display, NULL, NULL)) {
            fprintf(stderr, "X server has no OpenGL GLX extension\n");
            exit(1);
        }

        /* &configuration[1] skips GLX_DOUBLEBUFFER, requesting a
           single-buffered RGBA visual with a 16-bit depth buffer */
        vi = glXChooseVisual(display, DefaultScreen(display), &configuration[1]);

        /* Clean Up */
        XFree( vi );
        XSetCloseDownMode(display, DestroyAll);
        XCloseDisplay(display);

        printf("Opened and closed X window %d times\r", i);
    }
}
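Since the only clue points at glXChooseVisual, one workaround is to hoist the display connection and the visual selection out of the loop and reuse them, so glXChooseVisual is only called once per process. This is a sketch under that assumption (same attribute list as above), not a fix for whatever the driver or server leaks internally:

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void) {
    int configuration[] = { GLX_RGBA, GLX_DEPTH_SIZE, 16, None };

    /* Open the display and choose the visual once, up front. */
    Display *display = XOpenDisplay(NULL);
    if (display == NULL) {
        fprintf(stderr, "Cannot open display.\n");
        return 1;
    }
    XVisualInfo *vi = glXChooseVisual(display, DefaultScreen(display),
                                      configuration);
    if (vi == NULL) {
        fprintf(stderr, "No matching visual.\n");
        return 1;
    }

    /* ... create and destroy windows here, reusing vi each time ... */

    /* Free the visual info and close the connection once, at the end. */
    XFree(vi);
    XCloseDisplay(display);
    return 0;
}
```

This doesn't explain the leak, but it sidesteps it by making the suspect call O(1) instead of O(iterations).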


A look into glXChooseVisual's man page says that it uses XGetVisualInfo, which returns a LIST of matching visuals. So the problem might be that glXChooseVisual gets a list of visuals from XGetVisualInfo and does not free it.
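For reference, here is a minimal sketch of the pattern the man page describes: XGetVisualInfo hands back a heap-allocated list of matching visuals, and the caller releases the entire list with a single XFree on the returned pointer. (Requires a running X server; the VisualScreenMask template match is just one example of a query.)

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void) {
    Display *display = XOpenDisplay(NULL);
    if (display == NULL) {
        fprintf(stderr, "Cannot open display.\n");
        return 1;
    }

    /* Ask for every visual on the default screen. */
    XVisualInfo template;
    int count = 0;
    template.screen = DefaultScreen(display);

    XVisualInfo *list = XGetVisualInfo(display, VisualScreenMask,
                                       &template, &count);
    printf("%d matching visuals on screen %d\n", count, template.screen);

    XFree(list);   /* one XFree releases the whole list */
    XCloseDisplay(display);
    return 0;
}
```

If glXChooseVisual performs such a query internally and only XFrees the single visual it returns, the rest of the list would leak on every call, which matches the behavior the original poster sees.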

Anyway, what do you need thousands of visuals for in a single program? Or do you want to simulate thousands of program invocations?

I could not replicate the problem on RedHat Linux with a 2.6.x kernel and X 4.2.1 with NVIDIA drivers. My X memory usage is constant.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.