Crash after ~70,000 create/delete/recreate Window loops on Linux: GL_OUT_OF_MEMORY; Segmentation fault

After looping about 70,000 create/delete/recreate Window cycles on Linux (other platforms not tested), the app crashes. The error is GL_OUT_OF_MEMORY, "OpenGL 1285 Out of memory" (in a debug build), or a segmentation fault (in a release build).

This program is written in plain C (although I personally prefer C++). Save it as quad.c, and compile it with

gcc -o quad quad.c -lX11 -lGL -lGLU

Then run it on Linux (tested on Ubuntu) with an NVIDIA graphics card.

// -- Written in C -- //

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>   /* uint64_t */
#include <inttypes.h> /* PRIu64 */
#include <X11/X.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <GL/glu.h>

void DrawAQuad()
{
  glClearColor(1.0, 1.0, 1.0, 1.0);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glOrtho(-1., 1., -1., 1., 1., 20.);

  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();
  gluLookAt(0., 0., 10., 0., 0., 0., 0., 1., 0.);

  glBegin(GL_QUADS);
  glColor3f(1., 0., 0.);
  glVertex3f(-.75, -.75, 0.);
  glColor3f(0., 1., 0.);
  glVertex3f(.75, -.75, 0.);
  glColor3f(0., 0., 1.);
  glVertex3f(.75, .75, 0.);
  glColor3f(1., 1., 0.);
  glVertex3f(-.75, .75, 0.);
  glEnd();
}

int main(int argc, char *argv[])
{
  for (uint64_t i = 0; i < 1000000000; i++)
  {
    Display *dpy = XOpenDisplay(NULL);

    if (dpy == NULL)
    {
      printf("\n\tcannot connect to X server\n\n");
      exit(1);
    }

    Window root = DefaultRootWindow(dpy);

    GLint att[] = {GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None};
    XVisualInfo *vi = glXChooseVisual(dpy, 0, att);

    if (vi == NULL)
    {
      printf("\n\tno appropriate visual found\n\n");
      exit(1);
    }
    //else
    //{
    //  printf("\n\tvisual %p selected\n", (void *)vi->visualid); /* %p creates hexadecimal output like in glxinfo */
    //}

    Colormap cmap = XCreateColormap(dpy, root, vi->visual, AllocNone);
    XSetWindowAttributes swa;
    swa.colormap = cmap;
    //swa.event_mask = ExposureMask | KeyPressMask;

    //Window win = XCreateWindow(dpy, root, 0, 0, 600, 600, 0, vi->depth, InputOutput, vi->visual, CWColormap | CWEventMask, &swa);
    Window win = XCreateWindow(dpy, root, 0, 0, 600, 600, 0, vi->depth, InputOutput, vi->visual, CWColormap, &swa);
    if (!win)
    {
      printf("fail to create window\n");
      exit(1);
    }

    XStoreName(dpy, win, "VERY SIMPLE APPLICATION");
    XMapWindow(dpy, win);

    GLXContext glc = glXCreateContext(dpy, vi, NULL, GL_TRUE);
    glXMakeCurrent(dpy, win, glc);

    // glEnable(GL_DEPTH_TEST);
    //sleep(1);//without this all windows after 1st window will be black......

    // while (1)
    //{
    // XEvent xev;
    // XNextEvent(dpy, &xev);

    // if (xev.type == Expose)
    //{
    XWindowAttributes gwa;
    XGetWindowAttributes(dpy, win, &gwa);
    glViewport(0, 0, gwa.width, gwa.height);
    DrawAQuad();
    glXSwapBuffers(dpy, win);
    //sleep(1);
    //}
    // else if (xev.type == KeyPress)
    //{
    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, glc);
    XDestroyWindow(dpy, win);
    XFreeColormap(dpy, cmap);
    XFree(vi); /* the XVisualInfo returned by glXChooseVisual must be freed */
    XCloseDisplay(dpy);
    //sleep(1);
    // exit(0);
    //}
    //} /* this closes while(1) { */

    if (i % 10000 == 0)
    {
      printf("i(%" PRIu64 " / %" PRIu64 ")\n", i + 1, (uint64_t)1000000000);
    }
  }
  return 0;
} /* this is the } which closes int main(int argc, char *argv[]) { */

modified from: Programming OpenGL in Linux: GLX and Xlib - OpenGL Wiki (khronos.org)

You’re using X and OpenGL. In this code, you’re not checking for errors from either one. You should add both.

You’re right, but that might not be the key. This is a simplified version of the most basic demo I found on the internet; the original demo, which does have error checking, also crashed after about 65,000 loops.
Thanks!

Well, it’s not helpful to say you’re getting GL_OUT_OF_MEMORY with Program A and then post Program B. Readers now have no clue what you’re really doing.

Also, code posted on the Internet is rarely bug-free. So regardless, it likely contains bugs.

What you’re doing is pathological and rarely tested because it serves no purpose except to help ensure memory is being cleaned up properly by your app (…which evidence suggests it might not be).

Add error checking to help get a line on your errors. Selectively disable code to help isolate them. You can also try running your app under valgrind (e.g. valgrind --leak-check=full ./quad) to see whether you’re leaking memory (or abusing it) and, if so, where.

On NVIDIA GPUs, you can use the NVX_gpu_memory_info extension to monitor GPU memory usage and help detect if/when you’re leaking GPU memory:

That said, GL_OUT_OF_MEMORY is a very generic error, so it could refer to GPU memory, CPU memory, or some other limited driver resource.
