Anti-aliasing in real time!

I’ve used the standard approaches to anti-aliasing in OpenGL - the accumulation buffer, and also polygon/line smoothing. Whilst these techniques work, the frame rate collapses from ~100/sec down to 1-2/sec, even with hardware acceleration!

Is there any other method to remove the ‘jaggies’ whilst maintaining workable frame rates, or should I forget anti-aliasing?

Many thanks

Andrew Jameson

Accumulation buffer support on consumer hardware is pretty much nonexistent at the moment, so I’d say you’re being kicked into software when you try it. Polygon antialiasing requires all your primitives to be depth-sorted, which isn’t usually the most efficient way to sort them.

To be honest, I wouldn’t try antialiasing in application code at the moment. Next-gen chipsets (e.g. Glaze) will be able to do it in hardware at reasonable framerates, just by supersampling (i.e. rendering the image at a higher resolution, then scaling it down to screen size with an interpolating filter).
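The scale-down step of supersampling is simple enough to sketch in plain C - a minimal box filter over 2x2 blocks, assuming an 8-bit grayscale buffer (the function name and layout here are made up for illustration; real hardware would do this filtering itself):

```c
/* Downsample a (2*w) x (2*h) grayscale image to w x h by averaging
 * each 2x2 block of source pixels -- the filtering half of the
 * supersampled anti-aliasing described above. */
void downsample2x(const unsigned char *src, unsigned char *dst, int w, int h)
{
    int sw = 2 * w; /* source row width in pixels */
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int sum = src[(2*y)   * sw + 2*x] + src[(2*y)   * sw + 2*x + 1]
                    + src[(2*y+1) * sw + 2*x] + src[(2*y+1) * sw + 2*x + 1];
            dst[y * w + x] = (unsigned char)(sum / 4);
        }
    }
}
```

A hard black/white edge in the supersampled image becomes a mid-grey pixel in the output, which is exactly where the jaggies get softened.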

Yes, I know this problem! Do you want to do fullscreen anti-aliasing? This is the big difference between “mass-market” 3D cards, like the TNT2 or GeForce, and a “real” hardware OpenGL accelerator - the second kind of card doesn’t have these performance lags. Three days ago I had a discussion with a developer from Evans & Sutherland, an experienced firm in GL hardware. We talked about this performance problem, and he said that it is unsolvable with a mass-market chip card.
Fullscreen anti-aliasing is very power-expensive - using GL extensions doesn’t help! Beware of doing fullscreen anti-aliasing!

Dear Sir:
Sorry to disturb you.
I am a fan of Computer Graphics and want to ask you two questions. First: I want to know which REAL OpenGL accelerator card can now perform real-time fullscreen-aliasing, and how about the ELSA GLORIA II and ES Tornado 3000? The second question: how can I implement the function with OpenGL if the card supports it?
Thanks a lot.


Dear Sir,

thank you for your interest in our opengl product line. your formality has been noted, and we can recommend you chill out. relllaaaaaxx. ))

it depends on what you mean by “supporting” it. if your card supports the accumulation buffer, then you have to do antialiasing manually. (I guess you mean antialiasing, since all cards can alias to their hearts’ content.) by “manually” I mean you have to explicitly redraw the scene with subpixel camera offsets. (opengl will take care of blending them together for you, tho’.)

if your card is really cool, then it might be able to do antialiasing by… well, a couple of methods. having an increased virtual resolution is one of them, but a slightly different tack is to create “buckets” within a pixel. they’re effectively the same sort of thing, tho’, and it’s transparent to the programmer. in this instance, you’d give opengl the hint to make polygons etc. look “nice”, and cross your fingers.
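The “buckets within a pixel” idea can be sketched in a few lines of C: estimate how much of a pixel a primitive covers by testing a grid of subpixel sample points, then use that fraction for blending. Here the “primitive” is just the half-plane x < edge_x inside a unit pixel - a made-up stand-in for whatever edge the hardware would actually test:

```c
/* Estimate the covered fraction of a unit pixel by testing an n x n grid
 * of subpixel "bucket" centres against a vertical edge at x = edge_x.
 * (Only the x coordinate matters for a vertical edge, but the full 2D
 * grid is kept to mirror how real subpixel buckets are laid out.) */
float pixel_coverage(float edge_x, int n)
{
    int inside = 0;
    for (int j = 0; j < n; j++) {
        for (int i = 0; i < n; i++) {
            float sx = (i + 0.5f) / n;  /* bucket centre, in 0..1 */
            if (sx < edge_x)
                inside++;
        }
    }
    return (float)inside / (float)(n * n);
}
```

An edge slicing a pixel in half comes back as coverage 0.5, i.e. a half-blended pixel instead of a jagged step.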

hope this helped!


for hardware anti-aliasing, check out either the Diamond Fire GL1 or some of the high-end 3Dlabs Oxygen cards.

and then look at the price.

Can’t have it all I guess.

check out the Mercury card. it’s insane. it has four Obsidian cards wired together by an interconnect bus, and it does subpixel sampling. check out their picture - it’s almost funny; it looks like a mighty hack (wiring parts together without redesigning), but… wow. (it uses a total of 24 Voodoo2 chips, 6 per card.)