Antialiasing not working with Geforce4 card

This is a complete stumper because it works everywhere but one particular machine that happens to have a more advanced graphics card than any other that I’ve tried my program on. Here’s what I’m executing to turn on polygon antialiasing:

glEnable( GL_BLEND );
glEnable( GL_POLYGON_SMOOTH );
glBlendFunc( GL_SRC_ALPHA_SATURATE, GL_ONE );
glHint( GL_POLYGON_SMOOTH_HINT, GL_DONT_CARE );

The program does VERY simple drawing - just triangles and octagons on a black background. All 2D. (You can go check out the program at www.leadtogold.com/software/genesaver for a screenshot from a working system.) With those lines in there, antialiasing is enabled and life is good. BUT, on this one system, nothing but a black screen gets drawn. (Removing those lines makes the program work, but introduces jaggies, of course.)

The machine in question has the latest blessed drivers from nVIDIA, and the GF4 is supposed to support antialiasing like nobody’s business. It’s running XP, which I at first thought might be the culprit, but a test on another XP box had everything working just fine.

The most promising idea I’ve had yet was to explicitly draw a black background as a giant polygon from (-1,-1) to (1,1), to make sure that the blending works correctly, since it occurred to me that this driver’s particular implementation might be relying on blending the foreground polygons with background polygons, which is the situation you’d always have in 3D games. No such luck, though.

Anyone ever seen anything like this? Are there any dangerous assumptions I’m making in my four lines of code? Help very much appreciated.


When it is said that the GeForce 4 supports antialiasing “like nobody’s business”, they’re not talking about GL_POLYGON_SMOOTH. They’re talking about multisampling or supersampling, or other methods they’ve come up with. I don’t think you can turn on supersampling from an app, but I think you can turn on multisampling.
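If the driver exposes ARB_multisample, turning it on from a GLUT app looks roughly like this (a sketch, not tested here; the GL_MULTISAMPLE_ARB value is copied from the spec in case your headers don't define it):

#include <GL/glut.h>

#ifndef GL_MULTISAMPLE_ARB
#define GL_MULTISAMPLE_ARB 0x809D    /* from the ARB_multisample spec */
#endif

/* Inside main(), with argc/argv from main:
   request a multisampled pixel format up front, then enable it. */
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_MULTISAMPLE);
glutCreateWindow("multisample test");
glEnable(GL_MULTISAMPLE_ARB);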

So do you think they support multisampling but not older glBlend type stuff, to the point of not even rendering anything that tries to use it? Seems kinda hokey - I expect better things from nVIDIA. (Especially since two of the other machines I tested on have a GF3 and a GF4MX.)

Do you reckon there’s any way I can ask the system whether it supports the brand of antialiasing I’m trying to use?
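(I suppose I could at least peek at the extension string, something like this untested check, with the _ARB values copied from the spec since my headers don't have them:)

#include <string.h>
#include <GL/gl.h>

#ifndef GL_SAMPLE_BUFFERS_ARB
#define GL_SAMPLE_BUFFERS_ARB 0x80A8    /* from the ARB_multisample spec */
#define GL_SAMPLES_ARB        0x80A9
#endif

/* Needs a current GL context. */
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
if (ext && strstr(ext, "GL_ARB_multisample")) {
    GLint buffers = 0, samples = 0;
    glGetIntegerv(GL_SAMPLE_BUFFERS_ARB, &buffers);   /* did we get a multisample format? */
    glGetIntegerv(GL_SAMPLES_ARB, &samples);          /* how many samples per pixel? */
}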

I thought I had this working on a GF4600 the other day, though I ended up using glBlendFunc(GL_SRC_ALPHA, GL_ONE) instead.

Make sure you have GLUT_ALPHA in the glutInitDisplayMode call.

Also check that you are drawing your primitives with alpha set to 1 and clearing the screen with an alpha of zero.

I can’t explain why it would work on two of your geforces but not the third.
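Roughly what I mean, all together (untested sketch, assuming GLUT and your existing smooth-polygon setup):

/* Init: ask for a destination alpha channel. */
glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_ALPHA);

/* Once the context exists: */
glEnable(GL_BLEND);
glEnable(GL_POLYGON_SMOOTH);
glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);

/* Each frame: clear with alpha = 0, draw with alpha = 1. */
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
/* ...draw the triangles and octagons... */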

W00t! That did it, Adrian! Thanks!!! I haven’t tested it on all my available systems yet, but it works on the problem child.

I’ll give you a mention in my About box once I add a list of acknowledgements. :)

You really should heed what korval wrote. Fast antialiasing is achieved using the multisample (and supersample) style of rendering. Your saturate-alpha approach requires a global polygon-level sort and blending of every fragment. It does yield very high quality results, but it’s not really what anyone means when they talk about the hardware antialiasing ability of graphics cards. A lot of graphics hardware has always been able to perform blended primitive antialiasing, but it has rarely been used in earnest because of the performance and sorting issues, and it’s not sold as hardware antialiasing except perhaps w.r.t. antialiased line rendering for CAD markets.
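To make the sorting requirement concrete, the classic blended-primitive recipe looks something like this (a sketch only; polys[], numPolys and drawPolygon() are placeholder names, and it assumes a pixel format with destination alpha):

/* Old-style blended polygon antialiasing -- sketch, placeholder names. */
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);    /* destination alpha starts at 0 */
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_POLYGON_SMOOTH);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);
glDisable(GL_DEPTH_TEST);                /* depth buffering defeats the blend */

/* The application must sort its polygons front to back itself. */
int i;
for (i = 0; i < numPolys; ++i)
    drawPolygon(&polys[i]);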

P.S.

Here is a PDF you might find useful:
http://developer.nvidia.com/docs/IO/2644/ATT/GDC2002_Multisample.pdf

Korval: just a remark, you can control the FSAA from an application; it’s just a matter of selecting the right pixel format.
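For example, on Windows the multisampled formats are picked with wglChoosePixelFormatARB, assuming WGL_ARB_pixel_format and WGL_ARB_multisample are available (sketch only; the function pointer has to be fetched with wglGetProcAddress after creating a throwaway context first, and hdc here stands for your window’s device context):

/* Sketch: choosing an FSAA-capable pixel format on Windows. */
int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     24,
    WGL_ALPHA_BITS_ARB,     8,     /* destination alpha as a bonus */
    WGL_SAMPLE_BUFFERS_ARB, 1,     /* want multisampling... */
    WGL_SAMPLES_ARB,        4,     /* ...with 4 samples per pixel */
    0
};
int format;
UINT count;
if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count) && count > 0) {
    /* SetPixelFormat(hdc, format, &pfd), create the context,
       then glEnable(GL_MULTISAMPLE_ARB) once it is current. */
}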