wglChoosePixelFormatARB

Hi,
I’ve been having a spot of bother using WGL extensions. I’m attempting to use wglChoosePixelFormatARB in conjunction with glClampColorARB to turn off the clamping of colour values output from the fragment shader (rest assured I have a very good reason for wanting to turn off the clamping and have been informed this is the best way to do it; any other suggestions are welcome). However,

wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress( "wglChoosePixelFormatARB" );

returns NULL. I can’t even call

wglGetExtensionsStringARB = (PFNWGLGETEXTENSIONSSTRINGARBPROC)wglGetProcAddress( "wglGetExtensionsStringARB" );

which also returns NULL. Is there anything I need to make sure I’m doing before these calls? Bear in mind this was previously working with a simple ChoosePixelFormat call, before I started playing around with the clamping and changed the pixel formats accordingly.

I’m running this on an NVIDIA 6800 with ForceWare version 77.72.

Thanks in advance for any help :slight_smile:

Do you have a valid context when you make these calls?

You need an active context before you can make valid wglGetProcAddress calls, so you typically create a “dummy” context to get the handles.
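
For reference, a minimal sketch of that dummy-context pattern might look something like this (a sketch only; it assumes <windows.h>, <GL/gl.h> and wglext.h are included, and that hDC already has a basic pixel format set on it via ChoosePixelFormat/SetPixelFormat):

PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB = NULL;
PFNWGLCHOOSEPIXELFORMATARBPROC   wglChoosePixelFormatARB   = NULL;

HGLRC hDummyRC = wglCreateContext( hDC );	// temporary context
wglMakeCurrent( hDC, hDummyRC );		// make it current

// With a current context, wglGetProcAddress can return real entry points.
wglGetExtensionsStringARB = (PFNWGLGETEXTENSIONSSTRINGARBPROC)wglGetProcAddress( "wglGetExtensionsStringARB" );
wglChoosePixelFormatARB   = (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress( "wglChoosePixelFormatARB" );

// Once the pointers have been fetched the dummy context can be discarded;
// they are then used to pick the format for the "real" window and context.
wglMakeCurrent( NULL, NULL );
wglDeleteContext( hDummyRC );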

Even if it’s redundant: have a look at some of Humus’ demos and grab the framework. I think that explains it all (even if it’s a bit terse :slight_smile: ).

Thanks for the help, guys. I’ve made a bit of progress but am now stumped on something else along the same lines. I managed to get something going from a combination of things I read on NeHe, Humus, etc. However, while I now get no errors, it’s not performing as expected.

What I’m actually after is an output of unclamped float values (32 bits per component if possible) from the fragment shader, and my code looks something like this at the moment:

First I set up a PIXELFORMATDESCRIPTOR:

static PIXELFORMATDESCRIPTOR pfd =
{
	sizeof(PIXELFORMATDESCRIPTOR),  // size of this structure
	1,                              // structure version
	PFD_DRAW_TO_WINDOW |            // format must draw to a window
	PFD_SUPPORT_OPENGL |            // format must support OpenGL
	PFD_DOUBLEBUFFER,               // double buffered
	PFD_TYPE_RGBA,                  // RGBA pixel type
	128,                            // colour bits (asking for 32 per channel)
	32, 0, 32, 0, 32, 0,            // red, green, blue bits and shifts
	32,                             // alpha bits
	0,                              // alpha shift
	0,                              // accumulation buffer bits
	0, 0, 0, 0,                     // accumulation R/G/B/A bits
	16,                             // depth buffer bits
	0,                              // stencil buffer bits
	0,                              // auxiliary buffers
	PFD_MAIN_PLANE,                 // main drawing layer
	0,                              // reserved
	0, 0, 0                         // layer, visible and damage masks
};

and set a basic pixel format with no WGL stuff (apparently you have to do this first??):

void SetupPixelFormat( HDC hDC )
{
	int nPixelFormat;

	nPixelFormat = ChoosePixelFormat( hDC, &pfd );
	SetPixelFormat( hDC, nPixelFormat, &pfd );
}

Then I sort out the context:

hRC = wglCreateContext( hDC );	// Create the rendering context
wglMakeCurrent( hDC, hRC );	// Make the rendering context current

Next I set up the WGL stuff (just before the texture/shader setup; I’ve also tried placing it afterwards):

bool SetupWGLPixelFormat( HDC hDC )
{
	int nPixelFormat;

	wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress( "wglChoosePixelFormatARB" );

	if ( !wglChoosePixelFormatARB )
	{
		MessageBox( NULL, "wglChoosePixelFormatARB not supported", "Error! (SetupWGLPixelFormat)", MB_OK );
		return false;
	}

	BOOL bValidPixFormat;
	UINT nMaxFormats = 1;
	UINT nNumFormats;
	float pfAttribFList[] = { 0, 0 };
	int piAttribIList[] = { WGL_DRAW_TO_WINDOW_ARB,GL_TRUE,
					WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
					WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
					WGL_COLOR_BITS_ARB, 128,
					WGL_RED_BITS_ARB, 32,
					WGL_GREEN_BITS_ARB, 32,
					WGL_BLUE_BITS_ARB, 32,
					WGL_ALPHA_BITS_ARB, 32,
					WGL_DEPTH_BITS_ARB, 16,
					WGL_STENCIL_BITS_ARB, 0,
					WGL_DOUBLE_BUFFER_ARB, GL_TRUE,
					WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
					0, 0 };

	bValidPixFormat = wglChoosePixelFormatARB( hDC, piAttribIList, pfAttribFList, nMaxFormats, &nPixelFormat, &nNumFormats );

	if ( !bValidPixFormat )
	{
		MessageBox( NULL, "Invalid Pixel Format", "Error! (SetupWGLPixelFormat)", MB_OK );
		return false;
	}

	SetPixelFormat( hDC, nPixelFormat, &pfd );

	return true;
}

Then I attempt to turn off clamping:

glClampColorARB( GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE );
glClampColorARB( GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE );
glClampColorARB( GL_CLAMP_READ_COLOR_ARB, GL_FALSE );
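
(In case it matters: glClampColorARB comes from GL_ARB_color_buffer_float, so I grab it the same way as the WGL functions; something along these lines, with the extension-string check just a sketch and <cstring>/glext.h assumed for strstr and the typedef:)

PFNGLCLAMPCOLORARBPROC glClampColorARB = (PFNGLCLAMPCOLORARBPROC)wglGetProcAddress( "glClampColorARB" );

const char* szExtensions = (const char*)glGetString( GL_EXTENSIONS );
if ( !glClampColorARB || !strstr( szExtensions, "GL_ARB_color_buffer_float" ) )
{
	MessageBox( NULL, "GL_ARB_color_buffer_float not supported", "Error!", MB_OK );
}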

Before finally dumping values out with:

glReadPixels( 0, 0, WINDOWWIDTH, WINDOWHEIGHT, GL_RGBA, GL_FLOAT, fPixels );
glReadPixels( 0, 0, WINDOWWIDTH, WINDOWHEIGHT, GL_DEPTH_COMPONENT, GL_FLOAT, fDepth );

The problem is the values are still clamped… I’m told this is possible in GL, so I must be missing something.

Hope these questions aren’t too daft, but I’m jumping in at the deep end a bit with OpenGL, having only used DirectX previously. So there’s a bit of a combination of internet trawling and guesswork in the code. :smiley:

Thanks again.

Originally posted by sqrt[-1]:
[b]Do you have a context valid when you make these calls?

You need a context active before you can make valid wglGetProcAddress calls. So you typically create a “dummy” context to get the handles.[/b]
Are you sure this will suffice? Contrary to Linux, the function pointers returned under Windows depend on the context. Clearly, a GL function pointer under Windows might not work (i.e. point at the right address), depending on the context.

Here is the quote from the specs (glXGetProcAddressARB):

[b]
Should corresponding functions exist in the window-system specific
layer on non-GLX implementations?

Yes. wglGetProcAddress already exists for Microsoft Windows, and
Apple has stated they will support aglGetProcAddress.
Unfortunately, there is an unavoidable inconsistency with
wglGetProcAddress, which returns context-dependent pointers.
This should be made abundantly clear in the documentation, so
that portable applications assume context-dependent behavior.
[/b]
So a dummy context might be the worst thing to do under Windows. In my opinion…

The context is sorted out before any calls to wglGetProcAddress, so that shouldn’t be a problem… I hope :smiley:

Are you trying to create a floating-point framebuffer? That’s not supported by the hardware. Rendering to floating point has to be done in off-screen buffers: pbuffers or FBOs.
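
For instance, a rough sketch of the FBO route (using GL_EXT_framebuffer_object with a GL_ARB_texture_float format; the entry points are assumed to have been fetched already, and WINDOWWIDTH/WINDOWHEIGHT/fPixels are the names from the earlier post):

GLuint nFBO, nColourTex;

glGenTextures( 1, &nColourTex );
glBindTexture( GL_TEXTURE_2D, nColourTex );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, WINDOWWIDTH, WINDOWHEIGHT, 0, GL_RGBA, GL_FLOAT, NULL );	// 32-bit float per channel

glGenFramebuffersEXT( 1, &nFBO );
glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, nFBO );
glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, nColourTex, 0 );

if ( glCheckFramebufferStatusEXT( GL_FRAMEBUFFER_EXT ) != GL_FRAMEBUFFER_COMPLETE_EXT )
{
	// handle an incomplete framebuffer here
}

// ... render as usual; the fragment shader output now lands in the float texture ...

glReadPixels( 0, 0, WINDOWWIDTH, WINDOWHEIGHT, GL_RGBA, GL_FLOAT, fPixels );	// reads back unclamped values

glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, 0 );	// back to the window framebuffer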

Originally posted by jide:
[b]So a dummy context might be the worst thing to do under Windows. In my opinion…[/b]
Yes, I would also not recommend getting extension entry points under any context other than the one you will actually use them with. However, I believe the wglChoosePixelFormatARB-type calls are an exception to this rule, as you need a context to get the entry points, but these are the very functions you use to create a context. So chicken and egg…

As for the original poster: take note of what Humus said. Re-reading your post, it seems you are trying to create a floating-point framebuffer, which is not possible. Also note that you can only set the pixel format once for a window (looking at your example code).
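
Roughly what the usual pattern looks like (CreateRealWindow is a made-up helper; the attribute lists and pfd are the ones from your post):

// Because SetPixelFormat only works once per window, the WGL-chosen format
// has to go onto a fresh window, not the one used for the dummy context:
// 1. dummy window + basic format + dummy context -> fetch wglChoosePixelFormatARB
// 2. destroy the dummy context and dummy window
// 3. real window -> wglChoosePixelFormatARB -> SetPixelFormat (once) -> real context
HWND hRealWnd = CreateRealWindow();	// hypothetical helper
HDC  hRealDC  = GetDC( hRealWnd );

int  nPixelFormat;
UINT nNumFormats;
if ( wglChoosePixelFormatARB( hRealDC, piAttribIList, pfAttribFList, 1, &nPixelFormat, &nNumFormats ) && nNumFormats > 0 )
{
	SetPixelFormat( hRealDC, nPixelFormat, &pfd );	// the one and only time
	HGLRC hRealRC = wglCreateContext( hRealDC );
	wglMakeCurrent( hRealDC, hRealRC );
}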

Ahhh, I see. I was told floating-point framebuffers were possible. Looks like I’ll have to use FBOs then. Thanks, guys.