I recently added support for buffer regions in my software. It works perfectly on ATI hardware, but the call to wglCreateBufferRegionARB returns NULL on NVIDIA hardware and I haven’t been able to figure out why.
The “WGL_ARB_buffer_region” extension string is present and wglGetProcAddress() returns a valid function pointer. After the failing call, GetLastError() returns 0xC00705AA, and FormatMessage() with that code produces no message text. (That value looks like an HRESULT with facility WIN32 wrapping Win32 error 0x5AA = 1450, i.e. ERROR_NO_SYSTEM_RESOURCES, which would explain why FormatMessage() can’t resolve the raw value.)
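If that reading is right, the message text can be recovered by unwrapping the Win32 code first. A minimal sketch, not code from my application:

DWORD raw = 0xC00705AA;
if (HRESULT_FACILITY((HRESULT)raw) == FACILITY_WIN32)
{
    DWORD code = HRESULT_CODE((HRESULT)raw);   // 0x5AA = 1450 = ERROR_NO_SYSTEM_RESOURCES
    char msg[256] = { 0 };
    FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
                   NULL, code, 0, msg, sizeof(msg), NULL);
    // msg: "Insufficient system resources exist to complete the requested service."
}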
These are the attributes I’m using for the pixel format:
PIXELFORMATDESCRIPTOR pFd;
memset(&pFd, 0, sizeof(pFd));
pFd.nSize = sizeof(pFd);
pFd.nVersion = 1;
pFd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_STEREO | PFD_DOUBLEBUFFER;
pFd.iPixelType = PFD_TYPE_RGBA;
pFd.iLayerType = PFD_MAIN_PLANE;
pFd.cColorBits = 24;   // RGB colour depth, excluding alpha
pFd.cDepthBits = 32;   // also tried 16, see below
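For context, this is roughly how the descriptor gets applied; hWnd is a placeholder for my window, not a name from the code above. ChoosePixelFormat() can silently substitute a format that lacks requested capabilities (stereo in particular), so the result is worth inspecting:

HDC hDC = GetDC(hWnd);                          // hWnd: placeholder window handle
int format = ChoosePixelFormat(hDC, &pFd);      // driver picks the closest match
if (format != 0 && SetPixelFormat(hDC, format, &pFd))
{
    PIXELFORMATDESCRIPTOR actual;
    DescribePixelFormat(hDC, format, sizeof(actual), &actual);
    // actual.dwFlags may have lost PFD_STEREO even though the calls succeeded
}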
(I tried with and without PFD_STEREO, which is the only thing that seems remotely unusual, and that didn’t help. Neither did using 16 bits for depth.)
The call looks like this; no combination of front or back colour buffer, with or without the depth bit, makes any difference:
// Choose whichever colour buffer matches the current pixel format
UINT ColourBuffer = m_DoubleBuffer ? WGL_BACK_COLOR_BUFFER_BIT_ARB : WGL_FRONT_COLOR_BUFFER_BIT_ARB;
// iLayerPlane 0 = main plane; this is the call that returns NULL on NVIDIA
m_Buffer = wglCreateBufferRegionARB(pDC->m_hDC, 0, ColourBuffer | WGL_DEPTH_BUFFER_BIT_ARB);
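For reference, the whole path boils down to the following; a minimal sketch, with hDC standing in for a device context whose GL context is current:

#include <GL/wglext.h>   // defines PFNWGLCREATEBUFFERREGIONARBPROC and the bit constants

PFNWGLCREATEBUFFERREGIONARBPROC pwglCreateBufferRegionARB =
    (PFNWGLCREATEBUFFERREGIONARBPROC)wglGetProcAddress("wglCreateBufferRegionARB");
if (pwglCreateBufferRegionARB != NULL)
{
    // iLayerPlane 0 = main plane; one colour buffer bit plus the depth bit
    HANDLE region = pwglCreateBufferRegionARB(
        hDC, 0, WGL_BACK_COLOR_BUFFER_BIT_ARB | WGL_DEPTH_BUFFER_BIT_ARB);
    // same result here: NULL on NVIDIA, a valid handle on ATI
}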
This is such an old extension that I’m completely at a loss. It fails on every GeForce and Quadro card I’ve tried, with every driver version, but works fine on every ATI card I’ve tested. What am I doing wrong?