Antialiasing

OK, I guess I'm very close to getting it working, but I have one problem left.

I use Borland C++ Builder 5 and its forms.
I only have one form for the OGL program.

Now if I set up a temporary GLRC in order to initialize the WGL functions, that works.
But if I try to call SetPixelFormat a second time (after using wglChoosePixelFormatARB) with a pixel format index that IS capable of AA (I checked the index and its values with the NVIDIA PixelFormat app), the call fails.

I think the NVIDIA doc mentions this in some way (when you try to use the desktop window's hWnd and hDC).

So I guess I have to set up a REAL new window in order to retrieve its hWnd and hDC first, then initialize the WGL functions, and after that the second call of SetPixelFormat should work.

But my question is (I guess a pretty simple one for many people here *g*):

HOW do I create a temp window?
I guess I have to use the CreateWindow or CreateWindowEx function, but I have NEVER had to use them, because I use Borland's forms.

Any very easy samples for that (I know the NeHe tutorials use the CreateWindowEx function, but SO MANY lines of code for "ONLY" a new window)?

Please HELP me.

Diapolo

[This message has been edited by Diapolo (edited 02-14-2002).]

Hi folks, I’m back again. This just keeps getting more frustrating all the time.

I just bought a new GF4/MX400 ($139). Of course I thought that any GF4 board must have the multisample programming support working by now. Wedge said that he has it working on GF3 boards.

Well, guess what: I ran the Nvidia PxlFmt.Exe program and it shows NO AA PixelFormats again! My code says the same thing. So this is still not working.

I also noticed, as wedge mentioned, that this GF4/MX board does NOT have GL_MULTISAMPLE in the extensions string either!

I am beginning to think that Nvidia does not support the programming interface for the multisample mode on ANY mx board. I’m going to take this GF4/MX board back and get the GF3/TI ($199) instead. Now I know why it costs more money.

My advice to you guys trying to get this working as well, is to download the Nvidia PxlFmt.Exe program right away and test your cards first (I gave the link above). If you don’t see any PixelFormats with ‘1’ in the AA column – your card is not going to work. You better find out if your card is even capable of offering these pixel modes.

To Diapolo:
Yes, you MUST use a temp wnd. Once you set a pixel format to a window, it won’t let you change it. That’s why your second call failed. I saw the same thing.

I don't use the Win API CreateWindow procs. However, you need the message pump/handler from the main application, so it's best just to create a form at runtime. I use FormTmp = TForm.CreateNew(). You don't need to actually show the form. After you do the CreateNew, you can start using its window Handle to set up your DC and then the rendering context. Grab your pointers to the wgl procs, then deactivate the context, delete the context, and then delete the FormTmp. From there you can set up the real window with the new wgl calls for AA formats.

I’ve got the code written for that now, and at least that’s what I expect to see happen whenever I get a board that actually gives me AA pixel formats!

Bye… Chris.

Originally posted by Chris_S:
[b]I just bought a new GF4/MX400 ($139). Of course I thought that any GF4 board must have the multisample programming support working by now. Wedge said that he has it working on GF3 boards.

Well, guess what: I ran the Nvidia PxlFmt.Exe program and it shows NO AA PixelFormats again! My code says the same thing. So this is still not working.

I also noticed, as wedge mentioned, that this GF4/MX board does NOT have GL_MULTISAMPLE in the extensions string either!

I am beginning to think that Nvidia does not support the programming interface for the multisample mode on ANY mx board. I’m going to take this GF4/MX board back and get the GF3/TI ($199) instead. Now I know why it costs more money.
[/b]

Well, unfortunately, nVidia listened to their marketing people instead of ignoring them when they were coming up with names for the new inexpensive boards. The GF4/MX is closer to a GeForce2 than a GeForce3, much less a GeForce4TI.

However, while I haven’t tried the PxlFmt.exe program you mention, I have successfully produced FSAA windows using my GeForce 3 in my own program (using the info in the nVidia doc). So it does definitely work using a GeForce 3 (at least with the 23.11 drivers, anyway).

One thing I did notice, is that program crashes when using an FSAA window seem significantly more likely to cause the driver to get corrupted, requiring a reboot. In fact, on my computer at least, I occasionally have to power down. Rebooting doesn’t always fix the problem.

–Travis Cobbs

Well it looks like we have one of the first of what could be many dissatisfied NVIDIA customers.

GF4 MX may yet be renamed Dan Vivoli’s folly, but what was the alternative? It’s not even a GF3, so you’d have had GF4 on the high end and GF2 uber-ultra-ptang-ptang at the low end.

It’s still a nice card, but MX may be too cryptic a label.

@Chris_S: Dorbie is right, if you want to use extensions DON'T buy a card with MX. We also have problems on the GF2/MX. If I use the GL_TEXTURE_RECTANGLE_NV and GL_EXT_texture_env_combine extensions, I get problems on the GF2/MX. But I never examined the problem closely, since we do not support MX cards *g*.
The GF4/MX is based on the GF2/MX, so I think it will have the same problems and will not support all extensions.
So buy a GF3 or GF4/TI. But the 4 doesn't have that many new things: a better shader, a new AA method, and it is a little faster. In my opinion the interesting thing about the GF4 is that you can use 2 displays.
I heard the new AA method on the GF4 only works with DirectX? Does anyone know something about that?

Does anyone have a simple program with antialiasing that I could use to test my Kyro 2? My program doesn't seem to work, and I can't check the pixel formats because the Nvidia pixel format program won't run on my PC either; I guess it requires an Nvidia card. I have a Kyro 2, which supposedly supports antialiasing.

Basically, has anyone got a program that will draw a triangle and show me it antialiased (or crashing)?

To Chris_S:

You are on Borland, too, right?
Or are there forms in VC++, too?

I grabbed some simple code for creating a window from some webpage. Could anyone have a short look to see if it could be problematic somewhere?

LRESULT CALLBACK WindowProcedure(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
return DefWindowProc(hWnd, msg, wParam, lParam);
}

void SetTempWindow(void)
{
bool bSuccess = true;
HWND hWnd = NULL;
HGLRC hGLRC = NULL;

WNDCLASS wClass;
wClass.style         = CS_OWNDC;
wClass.lpfnWndProc   = WindowProcedure;
wClass.cbClsExtra    = 0;
wClass.cbWndExtra    = 0;
wClass.hInstance     = HInstance;
wClass.hIcon         = LoadIcon(NULL, IDI_WINLOGO);   // system icon, so the instance must be NULL
wClass.hCursor       = LoadCursor(NULL, IDC_ARROW);
wClass.hbrBackground = NULL;
wClass.lpszMenuName  = NULL;
wClass.lpszClassName = "OpenGL";

RegisterClass(&wClass);

hWnd = CreateWindow("OpenGL",                       // class name
                    "Temp OGL Window",              // window name
                    WS_DISABLED,                    // window style
                    CW_USEDEFAULT, CW_USEDEFAULT,   // starting position
                    0, 0,                           // width and height
                    NULL,                           // parent handle
                    NULL,                           // menu handle
                    HInstance,                      // instance handle
                    NULL);                          // other parameters

if(hWnd == NULL)
{
    // ERROR
}

… more code
}

By the way, AA works for me on GF3, with 2 or 4 samples, in 16 or 32 bit, and with the GL_NV_multisample_filter_hint extension.

Diapolo

[This message has been edited by Diapolo (edited 02-15-2002).]

Hi

I also have a GF2MX400, and nvpixelformat doesn't recognize AA.
But when I manually force the driver to do 2x or 4x AA, it gets rendered with AA, just not under the control of the app.

Bye
ScottManDeath

I think the solution to the whole problem is that only GeForce3 (and higher) supports multisampling AA (and therefore only GF3 and higher support the required GL and WGL extensions).
I dunno what the ATI cards are capable of.

A GF2 / GF2MX and all the NV cards below GF3 support a form of AA that is called supersampling. Perhaps this can be turned on with glEnable(GL_POLYGON_SMOOTH); the Red Book says that blending has to be active in order for it to work. There is also a hint available that could be useful to choose the quality: glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST or GL_FASTEST).
But it could be that the SS AA can only be turned on via the NV control panel.

Diapolo

[This message has been edited by Diapolo (edited 02-15-2002).]

Originally posted by Diapolo:
A GF2 / GF2MX and all the NV cards below GF3 support a form of AA that is called supersampling. Perhaps this can be turned on with glEnable(GL_POLYGON_SMOOTH); the Red Book says that blending has to be active in order for it to work. There is also a hint available that could be useful to choose the quality: glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST or GL_FASTEST).
But it could be that the SS AA can only be turned on via the NV control panel.

While the GL_POLYGON_SMOOTH_HINT will likely work, it’s completely unrelated to FSAA. You have to sort your polygons back to front before drawing them in order for the hint to produce good results.

Also, while it runs significantly slower, the supersampling AA implemented on the GF2, etc cards is capable of producing significantly better results in certain situations (for example, when used with alpha textures).

One thing I forgot to ask. For those of you trying this and having it fail on GF2 hardware, do you have the latest drivers from nVidia installed? I don’t think the extension was supported in the pre-GF3 drivers at all.

–Travis Cobbs

Hi

I'm using the 27.42 drivers and, as said before, nvpixelformat doesn't let me test AA.

Bye
ScottManDeath

Success. I traded the GF4/MX for a GF3/TI200, and yes, it DOES have programmable PixelFormats. Basically anything above an MX should have these modes, but the GF2 and GF4 MX cards do NOT. The MX cards show 57 PixelFormats and the TI200 shows me 76 PixelFormats. AA modes all over the place. Whopping big change. (BTW this card sure is speedy.)

I will post a link shortly to an OpenGL test program I have that will work on any card, such as Mr_Smith with his Kyro. But my guess is that it probably does not support PAA (programmable anti-aliasing).

A lot of cards list HRAA, FSAA, SSAA, etc. on their specs but you can’t tell if they have PAA. I’d bet all the bottom end cards do not.

I'd be interested to know which of the other brands and models support PAA. The test program I'll link shortly has a place to email me your card's results. It would be interesting to know just how many, and which ones, do support PAA. I'll post the results.

(TO: D, Yes I use Borland Delphi, C-Builder, JBuilder, MSVC, MASM.)

Thanks, Chris.

What is programmable AA, or what do you mean by it?

I think the MULTISAMPLE Extensions are only for doing MS AA and not for doing Supersampling AA.

If this is the case, I don't understand why the other non-GF3/GF4 cards (should) list the WGL_ARB_multisample extension in the WGL extensions string.

You all said that you could not get any pixel format that supports AA with pre-GF3/GF4 cards, so I don't think MS AA will work on these cards at all.

But that leads me to the question: how do you use application-controlled AA on pre-GF3/GF4 cards, if not via GL_POLYGON_SMOOTH?

Would be pretty cool if someone could clear up these things.

Diapolo

Here is a link to my OpenGL test program. It will list all the extensions and PixelFormats of any 3D card, and give you a bunch of other stats on the card. I’d like to see the results of some of these other brands of cards. You can email me the results of your card using the link on the About tab.
http://www.linearx.com/cgi-bin/filebot.pl?/misc/OpenGLtp.zip

PAA is simply the term I use to indicate the ability to control the multisample mode under "programming" control.

Supersampling, I believe, was a term created by Nvidia, and I don't think it is an OpenGL standard. I think Nvidia came out with it prior to the multisampling standard, and it has now been replaced by multisampling. Supersampling was only controllable from the Control Panel.

The MX cards can do multisampling, but you can only activate it globally using the Nvidia Control Panel (I guess you could consider this their supersampling mode). You can’t do it using OpenGL calls from an application program. Hence what I mean by lack of ‘PAA’.

As Wedge accurately stated, in order to have 'PAA' there must be two extensions present: GL_ARB_multisample and WGL_ARB_multisample. The MX boards do not have the former.

GL_POLYGON_SMOOTH and GL_POLYGON_STIPPLE have nothing to do with multisampling. They concern the rasterization pattern within the polygon itself.

Bye… Chris.

The download of your test program seems broken; I only get a 1KB file and WinZip tells me it's corrupt.

OK, I guess we have the same opinion about the "PAA".

But WHY does NVIDIA list the WGL_ARB_multisample extension on non-GF3/GF4 cards, if it's not usable for OGL programmers (we don't get an AA-capable pixel format, and because of that, application-controlled AA is impossible on these cards under OGL)?

By the way, I don't think supersampling is an NVIDIA term, but I could be wrong.
It means you render the scene at a higher resolution and sample it down to the screen resolution, which means edges get smoothed.

Perhaps a person from NVIDIA could clear up why the WGL extension is listed, or why we can't use it.

Diapolo

UPDATE:

This seems interesting (got it out of the OGL 1.3 specs)

F.3 Multisample

Multisampling provides an antialiasing mechanism which samples all primitives multiple times at each pixel. The color sample values are resolved to a single, displayable color each time a pixel is updated, so antialiasing appears to be automatic at the application level. Because each sample includes depth and stencil information, the depth and stencil functions perform equivalently to the single-sample mode.

When multisampling is supported, an additional buffer, called the multisample buffer, is added to the framebuffer. Pixel sample values, including color, depth, and stencil values, are stored in this buffer.

Multisampling is usually an expensive operation, so it is usually not supported on all contexts. Applications must obtain a multisample-capable context using the new interfaces provided by GLX 1.4 or by the WGL ARB multisample extension.

Multisampling was promoted from the GL ARB multisample extension; the definition of the extension was changed slightly to support both multisampling and supersampling implementations.


There it says multisampling AND supersampling, but how can we use the supersampling?

[This message has been edited by Diapolo (edited 02-16-2002).]

You probably tried to do a right-click and got the HTML header page. That won't work. Just click on the link. The file bot should give you an initial HTML window that lists the file as 0.41MB. I just downloaded it and unzipped it with no problem.

Both the GL_… and WGL_… extensions have to be present to have programmable AA control. At least that is the way the Nvidia cards behave. Don’t ask me why. If we get some other people downloading my test program, we may see if the other cards behave differently.

There are no “supersampling” tokens or extensions in OpenGL, that is why I say it is an Nvidia name.

The generic term for this whole area is AA - AntiAliasing. That is simply rendering to a higher resolution buffer, and then downsampling (averaging, interpolating) to the final screen resolution. OpenGL provides a standardized API call set under the name “multisampling” for this buffer.

Individual vendors have made up marketing names for this, like High Resolution AntiAliasing (HRAA), Full Screen AntiAliasing (FSAA), Accuview, etc. Nvidia has their special Quincunx AA algorithm, and so on. They all add different bells and whistles to the "AA" arena. But many if not most of the AA solutions are only controllable from the Control Panel - not through the app.

Bye… Chris.

The generic term for this whole area is AA - AntiAliasing. That is simply rendering to a higher resolution buffer, and then downsampling (averaging, interpolating) to the final screen resolution. OpenGL provides a standardized API call set under the name “multisampling” for this buffer.

I don't agree with you.
The generic term is antialiasing, that's right, but multisampling and supersampling are 2 very different methods of achieving the antialiasing.
Supersampling renders the scene at a higher resolution and samples it down to the screen resolution, while multisampling is achieved via a set of screen-resolution images that are rotated in some way and modulated together (I don't know if that's exactly how it works, but I think that should be the right direction for multisampling).

There are no “supersampling” tokens or extensions in OpenGL, that is why I say it is an Nvidia name.

There are no tokens that have supersampling in them, but the OpenGL 1.3 specification mentions:

Multisampling was promoted from the GL ARB multisample extension; The
definition of the extension was changed slightly to support both multisampling and
supersampling implementations.

And there you see: multisampling and supersampling are 2 different forms of implementing the AA.

But still the question:

I know pre-GF3/GF4 cards do supersampling, but how can we control it, if the GL 1.3 specs say there is a way?
Perhaps NVIDIA has to update their drivers in order to get the supersampling AA to work?

Diapolo

UPDATE:

I got your test program and it works (I had to disable my download manager). I get 84 pixel formats, many of them with AA sample buffers (2 and 4 samples), like I saw before with the NVIDIA PixelFormat 1.0 app.

GeForce 3 with 27.30 drivers!

[This message has been edited by Diapolo (edited 02-16-2002).]

As your quote says:

“Multisampling was promoted from the GL ARB multisample extension; The definition of the extension was changed slightly to support BOTH multisampling and supersampling implementations.”

We may be splitting hairs here. They are talking about slight differences in implementation, and as it says, they were BOTH rolled into the common "multisample" calls in OpenGL.

There are different option modes (1.5x1.5), (2x2), (4x4), Quincunx, etc. which may reflect these differences in the resolutions or algorithms used, and supersampling may at one time have been called one of these specific modes. But my point is that there is only one AA frame buffer, and it is accessed by the "multisample" category of OpenGL calls.

Bye… Chris.

I got interested in looking further at the AA capabilities of these other cards, so I downloaded some info from ATI and Kyro. The Kyro doc gives some general mention of AA abilities but not much in the way of public technical details. Whether it has prog AA is hard to determine, but doubtful. Perhaps Mr_Smith can test his card and report back.

The ATI site had some interesting technical info. They are now pushing a new AA method called “SmoothVision” with oversampling levels of 2X, 3X, 4X, 5X, and 6X. But the new approach they are attaching to the “SmoothVision” moniker is “jittered” oversampling pixel locations. Rather than using fixed pixel coords, either straight or on an angle, they are moving them around in a somewhat random fashion. Just like audio dithering, we now have video pixel dithering.

As ATI mentions the terms supersampling and multisampling are becoming blurred:

“The two most commonly used anti-aliasing solutions are ‘Super sampling’ and ‘Multi-sampling’. Both of these methods blend sub-pixel samples to correct aliasing, but generate pixel samples in different ways. The exact distinction between these two methods is somewhat unclear, and has been defined mostly by marketing material more than anything else.”

While ATI has some good enhancements to the AA scene, it appears that they are not at present supporting these AA modes under programmable control. I downloaded two files, glATI.h and wglATI.h, which they state give the OpenGL extension support available for their cards:

/* GL_ARB_multisample
** Rage 128 * based : Not Supported
** Radeon * based : Not Supported
*/
/* WGL_ARB_multisample
** Rage 128 * based : Not Supported
** Radeon * based : Not Supported
*/

In spite of their new AA enhancements, it appears that ATI is not yet supporting prog AA at app runtime. Control Panel setup is the only way. Perhaps that will change in the future.

So at present the only video cards (at least in the PC market) that fully support programmable AA seem to be the Nvidia TI or higher cards.

Bye… Chris.

I tried the program

On the Kyro 2, the only information it could get was the extensions and the resolution (as well as the system info); everything else crashed, including the PixelFormats tab, which came up with the error "invalid DC".
On the Rage 128 I had exactly the same error, except it could get the name of the card.
On the Savage 4 everything worked; the tests all ran at about double the good value, except the last two tests, which locked the system up.

Was this written for Nvidia cards?
Help?

ps: all cards were in different systems, all running Windows 98.

Extensions list
Kyro 2
GL_ARB_multisample GL_ARB_multitexture GL_ARB_texture_compression
GL_ARB_texture_env_add GL_ARB_texture_env_combine GL_ARB_texture_env_dot3
GL_EXT_abgr GL_EXT_bgra GL_EXT_compiled_vertex_array
GL_EXT_draw_range_elements GL_EXT_packed_pixels GL_EXT_secondary_color
GL_EXT_separate_specular_color GL_EXT_stencil_wrap GL_EXT_texture3D
GL_EXT_texture_compression_s3tc GL_EXT_texture_env_add GL_EXT_texture_env_combine
GL_EXT_texture_filter_anisotropic GL_EXT_vertex_array GL_S3_s3tc
WGL_ARB_extensions_string WGL_EXT_swap_control WGL_ARB_pixel_format
WGL_EXT_swap_control GL_EXT_bgra

Rage 128
GL_ARB_multitexture GL_ARB_texture_border_clamp GL_ARB_texture_env_add
GL_ARB_transpose_matrix GL_ARB_vertex_blend GL_ATIX_pn_triangles
GL_ATI_texture_mirror_once GL_ATI_vertex_streams GL_EXT_abgr
GL_EXT_bgra GL_EXT_clip_volume_hint GL_EXT_compiled_vertex_array
GL_EXT_draw_range_elements GL_EXT_fog_coord GL_EXT_packed_pixels
GL_EXT_rescale_normal GL_EXT_secondary_color GL_EXT_separate_specular_color
GL_EXT_texgen_reflection GL_EXT_texture3D GL_EXT_texture_edge_clamp
GL_EXT_texture_env_add GL_EXT_texture_env_combine GL_EXT_texture_object
GL_EXT_vertex_array GL_KTX_buffer_region GL_MESA_window_pos
GL_NV_texgen_reflection GL_SGI_texture_edge_clamp GL_SGIS_texture_border_clamp
GL_SGIS_texture_lod GL_SGIS_multitexture GL_WIN_swap_hint
WGL_EXT_extensions_string WGL_EXT_swap_control WGL_ARB_make_current_read
WGL_ARB_pbuffer WGL_ARB_pixel_format WGL_EXT_swap_control
GL_EXT_bgra

Savage 4
GL_EXT_abgr GL_EXT_bgra GL_EXT_clip_volume_hint
GL_EXT_compiled_vertex_array GL_EXT_packed_pixels GL_EXT_stencil_wrap
GL_EXT_vertex_array GL_KTX_buffer_region GL_S3_s3tc
GL_SGI_cull_vertex GL_SGI_index_array_formats GL_SGI_index_func
GL_SGI_index_material GL_SGI_index_texture GL_WIN_swap_hint

The forum seems to totally lose spaces when I use them.
Also, does anyone know what the SGI extensions are on the Savage 4?

[This message has been edited by Mr_Smith (edited 02-17-2002).]