I'm not able to successfully call wglChoosePixelFormatARB

I’m running Windows 11, building with clang++ 18.1.8 from VSCode.

I was able to retrieve the function pointer. The issue is that when I call wglChoosePixelFormatARB, it returns false and the SM_ASSERT fires. How can I check which pixel formats are available?

    wglChoosePixelFormatARB = 
      (PFNWGLCHOOSEPIXELFORMATARBPROC)platform_load_gl_function("wglChoosePixelFormatARB");
    wglCreateContextAttribsARB =
      (PFNWGLCREATECONTEXTATTRIBSARBPROC)platform_load_gl_function("wglCreateContextAttribsARB");

    if(!wglCreateContextAttribsARB || !wglChoosePixelFormatARB)
    {
      SM_ASSERT(false, "Failed to load OpenGL functions");
      return false;
    }
    {
      RECT borderRect = {};
      AdjustWindowRectEx(&borderRect, dwStyle, 0, 0);

      width += borderRect.right - borderRect.left;
      height += borderRect.bottom - borderRect.top;
    }

    window = CreateWindowExA(0, title, // This references lpszClassName from wc
                            title,    // This is the actual Title
                            dwStyle,
                            100,
                            100,
                            width,
                            height,
                            NULL,     // parent
                            NULL,     // menu
                            instance,
                            NULL);    // lpParam

    if(window == NULL)
    {
      SM_ASSERT(false, "Failed to create Windows Window");
      return false;
    }

    dc = GetDC(window);
    if(!dc)
    {
      SM_ASSERT(false, "Failed to get DC");
      return false;
    }

    const int pixelAttribs[] =
    {
      WGL_DRAW_TO_WINDOW_ARB,                       1,  // Can be drawn to window.
      WGL_DEPTH_BITS_ARB,                          24,  // 24 bits for depth buffer.
      WGL_STENCIL_BITS_ARB,                         8,  // 8 bits for stencil buffer.
      WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,  // Use hardware acceleration.
      WGL_SWAP_METHOD_ARB,      WGL_SWAP_EXCHANGE_ARB,  // Exchange front and back buffer instead of copy.
      WGL_SAMPLES_ARB,                              4,  // 4x MSAA.
      WGL_SUPPORT_OPENGL_ARB,                       1,  // Support OpenGL rendering.
      WGL_DOUBLE_BUFFER_ARB,                        1,  // Enable double-buffering.
      WGL_PIXEL_TYPE_ARB,           WGL_TYPE_RGBA_ARB,  // RGBA color mode.
      WGL_COLOR_BITS_ARB,                          32,  // 32 bit color.
      WGL_RED_BITS_ARB,                             8,  // 8 bits for red.
      WGL_GREEN_BITS_ARB,                           8,  // 8 bits for green.
      WGL_BLUE_BITS_ARB,                            8,  // 8 bits for blue.
      WGL_ALPHA_BITS_ARB,                           8,  // 8 bits for alpha.
      0                                              
    };

    UINT numPixelFormats;
    int pixelFormat = 0;

    if(!wglChoosePixelFormatARB(dc, pixelAttribs,
                                0, // Float List
                                1, // Max Formats
                                &pixelFormat,
                                &numPixelFormats))
    {
      SM_ASSERT(0, "Failed to wglChoosePixelFormatARB");
      return false;
    }

WGL_COLOR_BITS_ARB should be 24 here; the WGL_ARB_pixel_format extension specification says that for RGBA pixel types it is the size of the color buffer, excluding the alpha bitplanes.

You should probably set WGL_SAMPLE_BUFFERS_ARB to 1 if you’re setting WGL_SAMPLES_ARB.

You might want to try removing the WGL_SWAP_METHOD_ARB setting and leaving it up to the driver.
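
Something like this, for example; just a sketch of the adjusted attribute list (WGL_SAMPLE_BUFFERS_ARB comes from WGL_ARB_multisample):

    const int pixelAttribs[] =
    {
      WGL_DRAW_TO_WINDOW_ARB,                       1,  // Can be drawn to window.
      WGL_SUPPORT_OPENGL_ARB,                       1,  // Support OpenGL rendering.
      WGL_DOUBLE_BUFFER_ARB,                        1,  // Enable double-buffering.
      WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,  // Use hardware acceleration.
      WGL_PIXEL_TYPE_ARB,           WGL_TYPE_RGBA_ARB,  // RGBA color mode.
      WGL_COLOR_BITS_ARB,                          24,  // Excludes the alpha bits.
      WGL_ALPHA_BITS_ARB,                           8,  // 8 bits for alpha.
      WGL_DEPTH_BITS_ARB,                          24,  // 24 bits for depth buffer.
      WGL_STENCIL_BITS_ARB,                         8,  // 8 bits for stencil buffer.
      WGL_SAMPLE_BUFFERS_ARB,                       1,  // Request a multisample buffer...
      WGL_SAMPLES_ARB,                              4,  // ...with 4x MSAA.
      0                                                 // Terminator.
    };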

Note that it only returns GL_FALSE if there’s some kind of error. If there are no matching formats, it sets numPixelFormats to zero and returns GL_TRUE.

If there’s an error, you can call GetLastError to obtain more detail.
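
Putting those two behaviours together, a sketch of how the call could be checked, reusing dc, pixelAttribs and SM_ASSERT from your snippet:

    UINT numPixelFormats = 0;
    int pixelFormat = 0;

    if(!wglChoosePixelFormatARB(dc, pixelAttribs,
                                0, // Float attribute list (none)
                                1, // Max formats to return
                                &pixelFormat,
                                &numPixelFormats))
    {
      // Actual error: ask Windows why before asserting (log it or inspect in a debugger).
      DWORD lastError = GetLastError();
      SM_ASSERT(false, "wglChoosePixelFormatARB failed");
      return false;
    }

    if(numPixelFormats == 0)
    {
      // The call succeeded, but no format matched the requested attributes.
      SM_ASSERT(false, "No matching pixel formats found");
      return false;
    }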

Thank you, that helped. This cpp file was built following a CelesteClone tutorial on YouTube. I would really like to learn OpenGL the hard way. I narrowed it down to the core problem. Here is what I discovered.

For reference: CelesteClone/src/win32_platform.cpp at main · trist007/CelesteClone · GitHub

If I comment out the FakeRC and FakeDC sections, then my program progresses. I don’t understand why people advise creating a fake rendering context and a fake device context, but it seems you need them in place in order to query the OpenGL system.

In any case, it looks like I was not able to properly load the two key functions from the C:\windows\system32\opengl32.dll file.

Why wouldn’t that opengl32.dll file contain these two functions? Maybe my platform_load_gl_function is not correct. What do you advise? I’m trying to initialize OpenGL and create a basic window on Windows 11.

    wglChoosePixelFormatARB =
      (PFNWGLCHOOSEPIXELFORMATARBPROC)platform_load_gl_function("wglChoosePixelFormatARB");

    wglCreateContextAttribsARB =
      (PFNWGLCREATECONTEXTATTRIBSARBPROC)platform_load_gl_function("wglCreateContextAttribsARB");

Because they’re extensions (hence the “ARB” suffix). opengl32.dll only exports the OpenGL 1.1 API (and the corresponding WGL functions). Any functions added by extensions or by later versions of the OpenGL API must be accessed via a function pointer obtained from wglGetProcAddress. That is presumably what platform_load_gl_function uses.
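
I can’t say what the tutorial’s platform_load_gl_function actually does, but a typical Win32 loader is a thin wrapper along these lines: try wglGetProcAddress first, and fall back to GetProcAddress on opengl32.dll for the core 1.1 entry points (some drivers signal failure with small sentinel values rather than NULL):

    #include <windows.h>

    void* platform_load_gl_function(const char* name)
    {
      // Extension and post-1.1 functions come from the driver via wglGetProcAddress.
      // This only works while an OpenGL context is current.
      PROC fn = wglGetProcAddress(name);

      // Some implementations return 1, 2, 3 or -1 instead of NULL on failure.
      if(fn == 0 || fn == (PROC)1 || fn == (PROC)2 || fn == (PROC)3 || fn == (PROC)-1)
      {
        // Core OpenGL 1.1 functions are exported directly by opengl32.dll.
        HMODULE opengl32 = LoadLibraryA("opengl32.dll");
        fn = GetProcAddress(opengl32, name);
      }

      return (void*)fn;
    }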

Bear in mind that the version of OpenGL which a system supports isn’t determined by Windows itself, but by the video driver. opengl32.dll is largely a wrapper which simply calls the appropriate video driver to do the actual work. On systems with more than one GPU, you could have a program with multiple windows on different displays and OpenGL calls for one window get passed to a different driver than those for another window. But opengl32.dll is part of Windows and exports a specific set of functions regardless of which video drivers are installed.

In order to call wglGetProcAddress, a valid OpenGL context must be created and made current. This is true for (almost) any OpenGL function. But it presents a particular issue for context creation because you can’t obtain a pointer to wglCreateContextAttribsARB (which creates a context) without already having a valid context. So you have a “chicken and egg problem”. The solution is to first use the older ChoosePixelFormat and wglCreateContext functions to create a context, because those are among the few OpenGL-related functions which can be called without a context, then use that “fake” context to obtain pointers to the newer functions.
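
A stripped-down sketch of that bootstrap (window-class registration and error checks left out; fakeWindow, fakeDC and fakeRC are just local names):

    // 1. Throwaway window with any reasonable legacy pixel format.
    HWND fakeWindow = CreateWindowExA(0, "FakeWndClass", "Fake", WS_OVERLAPPEDWINDOW,
                                      0, 0, 1, 1, NULL, NULL, instance, NULL);
    HDC fakeDC = GetDC(fakeWindow);

    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int fakeFormat = ChoosePixelFormat(fakeDC, &pfd);
    SetPixelFormat(fakeDC, fakeFormat, &pfd);

    // 2. Old-style context; once it is current, wglGetProcAddress works.
    HGLRC fakeRC = wglCreateContext(fakeDC);
    wglMakeCurrent(fakeDC, fakeRC);

    // 3. Grab the extension entry points while the fake context is current.
    wglChoosePixelFormatARB =
      (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    wglCreateContextAttribsARB =
      (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

    // 4. Tear it all down. SetPixelFormat can only be called once per window,
    //    so the real context must be built on a fresh window and DC.
    wglMakeCurrent(fakeDC, 0);
    wglDeleteContext(fakeRC);
    ReleaseDC(fakeWindow, fakeDC);
    DestroyWindow(fakeWindow);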

When you call wglGetProcAddress, you aren’t obtaining a pointer to a function which is in opengl32.dll, but one which is in the (hardware-specific) video driver. Note that if the driver doesn’t support the extensions in question (WGL_ARB_pixel_format and WGL_ARB_create_context), wglGetProcAddress will return NULL. This can happen if the video driver isn’t installed correctly (in which case, OpenGL will fall back to Microsoft’s software implementation, which only supports OpenGL 1.1).
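
A quick way to see which implementation you actually ended up on is to print the context strings while the fake context is current; Microsoft’s software fallback typically reports itself as “GDI Generic” with version 1.1.0:

    // With a context current:
    printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    // "Microsoft Corporation" / "GDI Generic" / "1.1.0" means the software
    // fallback, which exposes no ARB WGL extensions.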


I was not able to call wglChoosePixelFormatARB successfully even though I was able to retrieve a valid pointer. It simply returns FALSE (the Windows BOOL), so I cannot get a good pixelFormat.

However, I did notice that if I comment these lines out of the previous code block (they clean up the dummy context used to retrieve the function pointers for wglChoosePixelFormatARB and wglCreateContextAttribsARB), then wglChoosePixelFormatARB does return successfully.

    // Clean up the fake stuff
    wglMakeCurrent(fakeDC, 0);
    wglDeleteContext(fakeRC);
    ReleaseDC(window, fakeDC);

    // Can't reuse the same (Device)Context,
    // because we already called "SetPixelFormat"
    DestroyWindow(window);

So am I supposed to leave the dummy context and window up and running? The tutorial I am following says to close and clean them up.

So if I close them and clean them up, then I am able to initialize OpenGL (GL_VERSION: 4.6.0 NVIDIA 566.36) using the normal ChoosePixelFormat, SetPixelFormat, wglCreateContext, wglMakeCurrent, etc.
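
For reference, the “normal” path I mean is roughly this (error checks omitted):

    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = 24;
    pfd.cStencilBits = 8;

    int pixelFormat = ChoosePixelFormat(dc, &pfd);
    SetPixelFormat(dc, pixelFormat, &pfd);

    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);
    // glGetString(GL_VERSION) then reports "4.6.0 NVIDIA 566.36".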

However, now I’ve run into another issue: my vertex shader fails to compile and does not really provide an error. I’ve inserted glCheckError() calls and inspected the glGetShaderInfoLog output, but the log contains NULL.
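
For reference, the kind of check I mean looks like this (shaderID stands in for my vertex shader handle):

    GLint success = 0;
    glGetShaderiv(shaderID, GL_COMPILE_STATUS, &success);
    if(!success)
    {
      GLint logLength = 0;
      glGetShaderiv(shaderID, GL_INFO_LOG_LENGTH, &logLength);

      char log[2048] = {};
      glGetShaderInfoLog(shaderID, sizeof(log), NULL, log);
      SM_ASSERT(false, "Failed to compile vertex shader");
      // In my case 'log' comes back empty.
    }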

I used a linter on my GLSL shader assets/shaders/quad.vert and it checks out.

Any idea what may be causing the shader compilation failure? I’m thinking that I really need to initialize using wglChoosePixelFormatARB and then wglCreateContextAttribsARB.

I may try GLUT or GLFW, but I really wanted to stick with the low-level stuff so that I understand it better.