I have 2 problems!

Problem #1:

Are you guys able to enable and disable this feature?

I am succeeding in getting pixel formats that support multisampling (2x, 4x, 6x), but when I call glDisable(GL_MULTISAMPLE_ARB), it does not disable. I used glIsEnabled to check the state and it returns GL_FALSE, yet everything on screen still looks FSAAed.
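For reference, the toggle in question is just this (a minimal sketch; it assumes a current context created on a multisample pixel format, with the ARB_multisample token coming from glext.h):

```c
#include <GL/gl.h>
#include <GL/glext.h>   /* for GL_MULTISAMPLE_ARB */

/* With a multisample pixel format current, toggling should be this simple.
 * On the driver described above, glIsEnabled reports GL_FALSE after the
 * glDisable call, but the rendered output is still antialiased. */
void set_multisample(GLboolean on)
{
    if (on)
        glEnable(GL_MULTISAMPLE_ARB);
    else
        glDisable(GL_MULTISAMPLE_ARB);

    /* What the driver claims the state is: */
    GLboolean reported = glIsEnabled(GL_MULTISAMPLE_ARB);
    (void)reported;  /* compare this against the actual output */
}
```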
This is not the first time I’m seeing this!
I’ve been through other cards and drivers as well.

Problem #2:
After I close and reopen my child window a few times, it looks like certain pixel formats disappear, and eventually the driver says there are no more pixel formats that support FSAA.

I have to close the main window and restart my app to get it working OK again.

Is this a memory leak from the driver? Maybe it is wasting video memory…

PS: I still don’t understand how OBVIOUS bugs like problem #1 slip through QA

I’m not sure, but for #2 it looks like you don’t release the RC and/or DC correctly. Every new child window requests a pixel format that supports multisampling, and can’t find a new one if they are all held by windows that no longer exist.
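In other words, the teardown for each child window needs to release everything in order. A minimal sketch (hypothetical helper; the handles are whatever your window class stored at creation time):

```c
#include <windows.h>

/* Hypothetical teardown for a child GL window. If any of these steps is
 * skipped, the pixel format can stay claimed by a dead window, and the
 * driver may eventually run out of multisample formats to hand out. */
void destroy_gl_window(HWND hwnd, HDC hdc, HGLRC hrc)
{
    wglMakeCurrent(NULL, NULL);   /* unbind the RC before deleting it */
    wglDeleteContext(hrc);        /* release the rendering context */
    ReleaseDC(hwnd, hdc);         /* give the DC back to the system
                                     (not needed for CS_OWNDC windows) */
    DestroyWindow(hwnd);          /* finally destroy the child window */
}
```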

I can disable it on my GeForce4s, but not on my 9700 Pros.

What did the function wglChoosePixelFormatARB return?

wglChoosePixelFormatARB succeeds. No problem there.
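For context, a typical multisample query in this situation looks something like the following sketch (token names are from WGL_ARB_pixel_format and WGL_ARB_multisample; the function pointer is assumed to have been fetched with wglGetProcAddress beforehand):

```c
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   /* WGL_* tokens and the wglChoosePixelFormatARB typedef */

/* Ask for a 32-bit double-buffered format with 24/8 depth-stencil and
 * 4x multisampling. Returns the chosen format index, or 0 on failure. */
int find_multisample_format(HDC hdc,
                            PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_COLOR_BITS_ARB,     32,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_STENCIL_BITS_ARB,   8,
        WGL_SAMPLE_BUFFERS_ARB, 1,
        WGL_SAMPLES_ARB,        4,
        0                                 /* zero-terminated list */
    };
    int format = 0;
    UINT count = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count) ||
        count == 0)
        return 0;
    return format;
}
```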

It looks like problem #2 happens when I’m running the debugger and it’s not consistent.

Will ATi ever allow ARB_multisample to be toggled?

I’m sure they will on future hardware, but the R3xx doesn’t support it.

Or it does support it, but they just haven’t got around to adding it to the drivers…

[This message has been edited by NitroGL (edited 10-14-2003).]

Originally posted by NitroGL:
I’m sure they will on future hardware, but the R3xx doesn’t support it.

What do you mean by “R3xx doesn’t support it”? They can’t do that. The spec says it has to disable, so it has to disable.

Maybe it already works in D3D.

What do you mean by “R3xx doesn’t support it”? They can’t do that. The spec says it has to disable, so it has to disable.

If someone goes into a driver screen and requests 2xAA, they get 2xAA, regardless of what the app requests. So, technically, an implementation can be forced to violate the spec, assuming that forced 2xAA is a violation of the spec.

It is entirely possible that a framebuffer, on their hardware’s implementation, that is set up for AA rendering must always be used for AA rendering. Would you prefer that they just not expose ARB_multisample at all?

The drivers also appear to ignore calls to glSampleCoverage.

They could support GLX/WGL_ARB_multisample, and just not support GL_ARB_multisample…

D’OH, I forgot to enable sample coverage. glSampleCoverageARB does work.
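For anyone else hitting this: sample coverage has its own enable, separate from GL_MULTISAMPLE_ARB. A minimal sketch (the function pointer is assumed fetched via wglGetProcAddress):

```c
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_SAMPLE_COVERAGE_ARB and the ARB typedef */

/* glSampleCoverageARB only takes effect once GL_SAMPLE_COVERAGE_ARB is
 * enabled; forgetting the glEnable makes the call look like a no-op. */
void use_half_coverage(PFNGLSAMPLECOVERAGEARBPROC glSampleCoverageARB)
{
    glEnable(GL_SAMPLE_COVERAGE_ARB);
    glSampleCoverageARB(0.5f, GL_FALSE);  /* 50% coverage, mask not inverted */
}
```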

[This message has been edited by NitroGL (edited 10-14-2003).]

In my app, wglChoosePixelFormatARB is also temperamental, and I am only guaranteed it will work the first time. After that, it will not return any formats, no matter what I ask for. My OpenGL window is a child window of the main window. This happens with my ATI 9700 Pro, but didn’t happen with my GeForce 3.

I’m pretty confident that I am correctly cleaning up all the RC and DC stuff, because I spent a long time trying to get the problem to go away, and looked at a lot of posts here about the “correct” way to shut down OpenGL and bring it back up.

I did find that as long as I cache all the info from all pixel formats the first time around, and implement my own replacement for wglChoosePixelFormatARB, I can successfully use all the pixel formats, even when wglChoosePixelFormatARB refuses to return any.
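The replacement chooser can be pure bookkeeping over the cached attributes. A minimal sketch, with hypothetical names (in practice the table would be filled once at startup via wglGetPixelFormatAttribivARB, querying WGL_COLOR_BITS_ARB, WGL_DEPTH_BITS_ARB, WGL_SAMPLE_BUFFERS_ARB, and WGL_SAMPLES_ARB for every format index):

```c
#include <stddef.h>

/* Hypothetical cached attributes for one pixel format. */
typedef struct {
    int index;           /* format index to pass to SetPixelFormat */
    int color_bits;
    int depth_bits;
    int sample_buffers;  /* nonzero if the format is multisample-capable */
    int samples;
} CachedFormat;

/* Minimal stand-in for wglChoosePixelFormatARB over the cached table:
 * return the index of the first format meeting the minimums, or -1. */
int choose_cached_format(const CachedFormat *table, size_t count,
                         int min_color, int min_depth, int min_samples)
{
    for (size_t i = 0; i < count; ++i) {
        if (table[i].color_bits >= min_color &&
            table[i].depth_bits >= min_depth &&
            table[i].sample_buffers > 0 &&
            table[i].samples >= min_samples)
            return table[i].index;
    }
    return -1;
}
```

Because the table is read once while the driver is still cooperating, later lookups never touch wglChoosePixelFormatARB at all.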

Has anyone e-mailed ATI devrel about this and gotten a response?

No, I haven’t emailed ATI about it. I’m sure they are aware of this one.

From what I understand, everything must be FSAAed if it is enabled.

However, it seems that Radeons do not FSAA things rendered with glDrawPixels, and perhaps even glBitmap.

I think they did this to compensate for the problem.

Actually, I wanted to eliminate all use of glDrawPixels, but maybe I’ll leave it as is.

How could you possibly do AA with glDrawPixels?