It all started with a simple glReadPixels call: the read-back data always had an alpha value of 255, ignoring the alpha set with glClearColor.
After tracking the problem down a little, I noticed that the context I had been creating had no alpha bits!
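(For reference, the missing alpha bits can be confirmed by describing the window's current pixel format — a rough sketch, assuming hdc is the window's device context:)

```c
PIXELFORMATDESCRIPTOR pfd;
memset(&pfd, 0, sizeof(pfd));
/* Query the format currently set on this DC and inspect its alpha plane */
DescribePixelFormat(hdc, GetPixelFormat(hdc), sizeof(pfd), &pfd);
printf("alpha bits: %d\n", pfd.cAlphaBits); /* was 0 for my context */
```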
After asking for a pixel format with 8 alpha bits, I still got many results, but none of them seemed to work with SetPixelFormat…
The code that creates the core-profile context is below:
/* Choose a pixel format via the WGL_ARB_pixel_format extension */
int pixel_fmts[25];
memset(pixel_fmts, 0, sizeof(pixel_fmts));
UINT num_fmts = 0;
const int attrib_list[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_TRANSPARENT_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB, GL_TRUE,
    WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB, 24,
    WGL_ALPHA_BITS_ARB, 8,
    WGL_DEPTH_BITS_ARB, 24,
    WGL_STENCIL_BITS_ARB, 8,
    0
};
wgl.wglChoosePixelFormatARB(*hdc, attrib_list, 0, 25, pixel_fmts, &num_fmts);

/* Populate the pfd struct in order to set the pixel format */
BOOL pixel_fmt_set = FALSE;
for (unsigned int i = 0; i < num_fmts; ++i) {
    PIXELFORMATDESCRIPTOR pfd;
    memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
    DescribePixelFormat(*hdc, pixel_fmts[i], sizeof(PIXELFORMATDESCRIPTOR), &pfd);
    /* Set the newly chosen pixel format */
    pixel_fmt_set = SetPixelFormat(*hdc, pixel_fmts[i], &pfd);
    if (pixel_fmt_set == TRUE)
        break;
}
assert(pixel_fmt_set);

/* Create the context using the extension and make it current */
int ctx_attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_DEBUG_BIT_ARB,
    WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0
};
*hglrc = (HGLRC) wgl.wglCreateContextAttribsARB(*hdc, 0, ctx_attribs);
wglMakeCurrent(*hdc, *hglrc);
Any ideas what might be off? Is it even possible to set a pixel format that supports alpha on Windows?