little problem with WGL_ARB_pixel_format


I'm trying to query how many pixel formats the driver offers, via this code:

int iNum_PFs = 0;
int iPFResults[8];
memset(iPFResults, 0, sizeof(iPFResults));
int iPFAttributes[8];
memset(iPFAttributes, 0, sizeof(iPFAttributes));
iPFAttributes[0] = WGL_NUMBER_PIXEL_FORMATS_ARB;

wglGetPixelFormatAttribivARB(hDC, 1, 0, 1, iPFAttributes, iPFResults);
iNum_PFs = iPFResults[0];

As you can see, I pass 1 for the iPixelFormat parameter after hDC.
I saw in an NVIDIA PDF that they used 0 there.
If I do that, the returned number of pixel formats is zero, which can't be right.

Any ideas? Is my solution OK, or will there be problems if I use 1 there (since 1 is the first pixel format)?


That shouldn't be a problem. From the spec (the entry for WGL_NUMBER_PIXEL_FORMATS_ARB):

The number of pixel formats for the device context. The <iLayerPlane> and <iPixelFormat> parameters are ignored if this attribute is specified.
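To illustrate, here is a minimal sketch of the query as the spec intends. It assumes the function pointer has already been loaded with wglGetProcAddress and that a Windows OpenGL context is current; GetNumPixelFormats is a hypothetical helper name, and the constant value is the one defined in wglext.h:

```c
// Sketch only: needs <windows.h>, <GL/wglext.h>, a current GL context,
// and the WGL_ARB_pixel_format extension.
#define WGL_NUMBER_PIXEL_FORMATS_ARB 0x2000

int GetNumPixelFormats(HDC hDC,
                       PFNWGLGETPIXELFORMATATTRIBIVARBPROC pwglGetPixelFormatAttribivARB)
{
    const int attrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int value = 0;

    // Per the spec, iPixelFormat and iLayerPlane are ignored for this
    // attribute, so 0 should be acceptable there; as discussed in this
    // thread, some drivers reportedly want a valid index such as 1 instead.
    if (!pwglGetPixelFormatAttribivARB(hDC, 1, 0, 1, &attrib, &value))
        return 0;  // query failed

    return value;
}
```

Passing the function pointer in explicitly just keeps the sketch self-contained; in real code it would typically be a global loaded once at startup.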

Ah OK, thank you very much!


It seems there is a small 'problem' in the latest NVIDIA drivers.
You can see PBrown's comment about it:

(Yes, currently you can work around it by using 1 after hDC.)

Hope this helps.

Thanks for the additional info.
I was pretty sure it's a driver bug and not my fault.


Originally posted by Diapolo:

Thanks for the additional info.
I was pretty sure it's a driver bug and not my fault.


Either way we implement it, it will be a driver bug. That's what internally inconsistent specs will do to you. :)

LOL… but it's great that you can have fun while saying that.

The ARB should word its specs better in some cases, right?


While the spec could have been worded better, I think it's pretty obvious that the intention was that you should only need to provide a valid pixel format index if the attribute you're requesting requires it.

Spec language always trumps spec intent. If the spec specifies useless behavior, useless behavior is what you get.

  • Matt

You guys in the ARB really should update it then… you know… fix these little inconsistencies in the specs…

Just a thought.