Radeon 9700 gamma corrected AA

Originally posted by V-man:
FSAA sucks if you ask me. Each pixel gets the color from its neighbor,

No. In true multisampled FSAA the samples aren’t shared between neighbouring pixels. The infamous ‘Quincunx’ breaks this rule (with a bad impact on overall picture sharpness…).

The reason NV 4x AA looks bad is that the samples are arranged in a regular grid (aligned with the pixel grid), so near-horizontal and near-vertical edges get quality no better than 2x (2-sample) AA. (You can draw this on a piece of paper and prove it to yourself.)
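
Here is a rough sketch of that paper exercise (the sample positions are illustrative, not the real NV or ATI patterns): for a near-horizontal edge sweeping through a pixel, only the distinct vertical sample offsets matter, so an ordered 2x2 grid produces no more coverage levels than 2-sample AA, while a rotated/sparse pattern produces one extra level per sample.

[code]
#include <stdio.h>

/* Count the distinct coverage levels a horizontal edge can produce inside
 * one pixel, given each sample's vertical offset.  The positions below are
 * illustrative only, not the real NV or ATI sample patterns. */
static int coverage_levels(const double *sample_y, int n)
{
    int levels = 1;                          /* "no samples covered" always exists */
    for (int i = 0; i < n; ++i) {
        int same_row = 0;
        for (int j = 0; j < i; ++j)
            if (sample_y[j] == sample_y[i])
                same_row = 1;                /* shares a row: adds no new level */
        if (!same_row)
            ++levels;
    }
    return levels;
}

int main(void)
{
    double ordered[4] = { 0.25, 0.25, 0.75, 0.75 };      /* 2x2 ordered grid: 2 rows */
    double rotated[4] = { 0.125, 0.375, 0.625, 0.875 };  /* rotated grid: 4 rows     */

    printf("ordered grid: %d coverage levels\n", coverage_levels(ordered, 4));  /* 3 */
    printf("rotated grid: %d coverage levels\n", coverage_levels(rotated, 4));  /* 5 */
    return 0;
}
[/code]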

Higher resolutions can never remove aliasing. All they do is help hide it, and that only works for certain types of aliasing.

Standard supersampling, which is just rendering at a higher resolution and scaling down, isn’t going to get the job done either. The regular grid approach just doesn’t work.

It seems that the Radeon 9700’s approach is to sample from a distribution of points that have little-to-no correlation with each other. By doing so, their antialiasing actually has a chance of removing aliasing artifacts.

BTW, conventional edge antialiasing (as opposed to multisample schemes like Quincunx, which antialias only edges) requires that polygons be finely sorted back-to-front, and appropriately clipped if they intersect each other. You don’t want that.

davepermen,

I don’t see what having a 32-32-32-32 backbuffer has to do with aliasing. The front buffer is still 8-8-8 or 10-10-10 (can anyone confirm whether the ATI 9700 actually has a 10-10-10 front-buffer mode? I heard it has a 10-bit RAMDAC), and if I draw a white triangle on a black background, all that precision does absolutely nothing to help the stair-stepping. Edge aliasing occurs on contrasting edges, and precision does not eliminate it.

Sample summation with gamma correction makes a HUGE difference. If you don’t agree then you need to do some research on this. It is an absolutely fundamental principle of computer graphics.

Your monitor has a non-linear response to voltage. So in 2-sample antialiasing, for example, 50% coverage gives (1.0 + 0.0) / 2.0 = 0.5, but this will not LOOK like half brightness on your monitor; it will look more like quarter brightness, which is VERY WRONG, and your antialiasing will suck. You must set gamma correction in hardware to around 2.5 for most monitors to give a linear response, and therefore produce correct sample weighting, on current hardware. Most gamers play with a hardware gamma of 1.0, so antialiasing looks crap.
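
A minimal sketch of that arithmetic, assuming an uncorrected monitor response of gamma 2.5 (the value is illustrative):

[code]
#include <stdio.h>
#include <math.h>

/* What a naive 2-sample average actually looks like on an uncorrected
 * monitor.  The gamma of 2.5 is an illustrative value for a typical CRT. */
int main(void)
{
    double monitor_gamma = 2.5;
    double framebuffer   = (1.0 + 0.0) / 2.0;            /* naive average = 0.5 */
    double displayed     = pow(framebuffer, monitor_gamma);

    /* prints roughly 0.18: much closer to quarter brightness than to half */
    printf("framebuffer %.2f displays as %.2f of full brightness\n",
           framebuffer, displayed);
    return 0;
}
[/code]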

The real bummer is that, for reasons of human perception, with an 8-bit framebuffer you NEED hardware gamma at around 1 to give uniform increments in contrast sensitivity for a human viewer, so setting hardware gamma to around 2.5 introduces banding in darker areas that is unavoidable, even if your artist tries to texture in that space or your game software has smart compensation built in.

The ATI announcement is fantastic for a couple of reasons. More precision in general allows gamma to be set high and avoids banding, IF the content is there. More impressively (and I’m making some assumptions about their real capability here, because they are annoyingly silent on this), I suspect you can set hardware gamma to the value you want for the content, your monitor and human contrast sensitivity, and the AA samples will STILL be weighted correctly. i.e. instead of (1+0)/2 = 0.5, the samples sum to some other value that LOOKS like half brightness; with gamma-correct AA, (1+0)/2 = 0.75, for example.
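
To make that 0.75 figure concrete, here is a sketch of the weighting I am assuming (my reading of what gamma-correct AA would do, not a confirmed ATI algorithm; the 2.4 display gamma is an assumption):

[code]
#include <stdio.h>
#include <math.h>

/* Sketch of gamma-correct sample weighting as I understand it (not a
 * confirmed ATI algorithm).  The display gamma of 2.4 is an assumption. */
int main(void)
{
    double display_gamma = 2.4;

    /* Two samples, white over black.  0.0 and 1.0 are unchanged by the
     * decode step, so the linear-light average is simply 0.5. */
    double linear_avg = (1.0 + 0.0) / 2.0;

    /* Re-encode so that the stored value LOOKS like half brightness. */
    double stored = pow(linear_avg, 1.0 / display_gamma);

    printf("gamma-correct 50%% coverage stores %.2f instead of 0.50\n", stored);
    return 0;   /* prints roughly 0.75 */
}
[/code]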

This should make it absolutely clear that gamma-correct AA is essential for good antialiasing, and the lower the hardware gamma correction value, the more compensation is required in the sample weighting (and since most users have a gamma of around 1, it will make an incredible difference).

All the blends etc. still produce what looks like non-linear brightness. Longer term, what is required is more precision in the content (textures etc.), rendering and sample summation in linear space (the way it is now), and gamma correction in hardware with enough precision to avoid the issues currently encountered with 8 bits. That’s why high-precision framebuffers and gamma tables are also interesting, but developers will have to USE this stuff, and start setting hardware gamma high by default in games, to get it right.

In the meantime ATI appear to have solved the problem, because someone there ‘gets it’ and has finally put one dirty little secret of the graphics industry to rest. But remember, I’m making some assumptions based on a bullet point that looks like they do the right thing.

I’m just wondering how you set the sample gamma with a 9700. To make this work in practice you need two gamma factors: one is the final gamma correction (which already exists on most cards), which can be set for content, user prefs and contrast sensitivity; the other is set to the absolute monitor gamma (or the net difference between user and monitor gamma) and is used to weight samples. My question still stands: ATI, how do you set the AA sample gamma factor, or is it assumed that the monitor requires a correction of around 2.4?

BTW, looking at a screenshot of AA with gamma over the web on YOUR monitor may be a lost cause. It’s not even clear whether this feature is enabled, how to set the sample gamma, or even whether they do the right thing.

Humus, that 9700 AA shot you posted is 6x not 4x.

In all of my time with 3D graphics, I have never really understood Gamma correction/non-linear brightness. Do you know any good books/web-sites on the subject?

Originally posted by dorbie:
Sample summation with gamma correction makes a HUGE difference. If you don’t agree then you need to do some research on this. It is an absolutely fundamental principle of computer graphics.

Here we go again, so now we need to research why AA is a good thing… rather than using our eyes.
It’s like the emperor’s new clothes…
I just don’t buy it - and I suspect consumers (i.e. gamers) won’t either.
If an effect isn’t noticeable, then it’s not worth the fill-rate hit.
No offense, Dorbie, and I’m sure you can quote lots of white papers and RenderMan docs to try and persuade me it’s a good thing, but we’re talking about gamers here, who are not going to be watching these images on a cinema screen… (at least not for a long while).

Korval,

There’s this issue of Dr. Dobb’s Journal:
http://www.ddj.com/articles/1999/9909/

I wrote this for a ‘simulation issue’ and was prompted to emphasize simulation, but it ended up in a graphics issue. In the article I tried to explain the issue of contrast sensitivity in particular, and why it has the effect it does on precision; I haven’t seen anyone else draw those precision graphs before or since.

In that article I reference Andrew Glassner’s two-volume set “Principles of Digital Image Synthesis”,
http://www.amazon.com/exec/obidos/tg/det…=books&n=507846

And there is Charles Poynton’s book “A Technical Introduction to Digital Video”.
http://www.amazon.com/exec/obidos/tg/det…=books&n=507846

It boils down to these points:

  1. The linear range of framebuffer values between 0 and 1 does not display as a linear range of brightnesses on a CRT. In fact the display is considerably darker over most of the range.

  2. Correct presentation of images on a CRT must compensate for this inherent flaw in the display technology, either by boosting the voltage on the wire to give uniform brightness, or by adjusting the digital values in the image so that the brightness is boosted. This is called “gamma correction”.

  3. The human visual system can detect smaller discrete increments in brightness in the darker portions of an image than in the lighter portions, and the gamma curve of a monitor works in favour of (good) digital images: 8 bits of information are pretty good at presenting a non-linear range of values to a human.

  4. By an amazing fluke, the non-linear response of the monitor is pretty close to the response required to give the human eye a perceptually uniform brightness increment at each value in a linear scale.

  5. Post-framebuffer hardware gamma correction, while desirable for things like linear blending (and antialiasing), is actually bad for 8-bit framebuffers, because your eye will see discrete value jumps in the shadows.

  6. NO hardware gamma correction, where software gamma correction has already been applied to a high-quality image to produce an 8-bit gamma-corrected image, is actually a GREAT way to display things like photos, which therefore have gamma correction built in already but were generated in devices like digital cameras from 10- or 12-bit data.

  7. Unfortunately, hardware gamma correction of an 8-bit digitally rendered image falls foul of point 3, because you will see banding in dark regions. You need high-precision source data to gamma correct into an 8-bit, gamma-corrected, perceptually uniform range.

So you can see the dilemma: you want perceptually uniform rendering, but computers don’t draw that kind of stuff. Even with a deep framebuffer, the source data needs sufficient precision.
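
To put some numbers on points 3, 5 and 7 above, here is a small sketch (the 2.4 display gamma is an assumed typical value): push the first few codes of an 8-bit linear framebuffer through hardware gamma correction and look at the size of the steps in the shadows.

[code]
#include <stdio.h>
#include <math.h>

/* Putting numbers on the banding problem: take the lowest codes of an 8-bit
 * *linear* framebuffer and push them through a hardware gamma correction of
 * 2.4 (an assumed typical display value). */
int main(void)
{
    double gamma = 2.4;
    for (int code = 0; code <= 4; ++code) {
        double linear    = code / 255.0;
        double corrected = pow(linear, 1.0 / gamma);
        printf("code %d -> %.3f of full brightness\n", code, corrected);
    }
    /* The very first step, code 0 to 1, already jumps to about 0.10 of full
     * brightness: a clearly visible band in the shadows. */
    return 0;
}
[/code]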

But think about this for a sec. If you have an 8-bit image that has been generated by a camera for display on a PC, it will have built-in gamma taking it to perceptually uniform space. You could un-gamma-correct it to 12-bit linear space, render in linear space at high precision, and gamma correct in hardware for great results with no banding; all the arithmetic is perfectly correct and the content is correct. The key is more precision in the framebuffer, but that’s not going to work for legacy apps, and frankly most graphics software developers have no clue about this :( so even new apps are unlikely to improve. It also requires the user or the game to set the card’s hardware gamma, which makes the Windows desktop look much brighter than normal.

In the meantime, AA weightings would be wrong for existing and many new applications, because they just won’t do this. The solution? Render in gamma-corrected space (or make the assumption that you do), un-gamma-correct the fragments, sum in linear space, and gamma correct the result to give accurate sample weightings for your display.

The dilemma is how much to un-gamma-correct and re-gamma-correct by. This depends on the hardware gamma setting and the display gamma. Ideally this subsample gamma correction factor should be the net correction between hardware and display gamma. Typically, with a normal display and your average PC, the fragments would have to be uncorrected with a factor of 2.4, summed, and recorrected, to give correct weightings for the display.
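
For what it’s worth, here is a sketch of the resolve I am describing: decode each fragment with the net gamma, sum in linear space, then re-encode. This is what I have advocated, not ATI’s documented method; the function name and the 2.4 default are mine.

[code]
#include <stdio.h>
#include <math.h>

/* Sketch of the resolve described above: decode each fragment with the net
 * gamma, average in linear light, re-encode for the display.  Not ATI's
 * documented method; the function name and the 2.4 default are assumptions. */
static double resolve_gamma_correct(const double *samples, int count, double net_gamma)
{
    double linear_sum = 0.0;
    for (int i = 0; i < count; ++i)
        linear_sum += pow(samples[i], net_gamma);   /* un-gamma-correct        */

    double linear_avg = linear_sum / count;         /* sum/average in linear   */
    return pow(linear_avg, 1.0 / net_gamma);        /* re-gamma-correct result */
}

int main(void)
{
    /* one white sample over three black ones: 25% coverage */
    double s[4] = { 1.0, 0.0, 0.0, 0.0 };
    printf("%.2f\n", resolve_gamma_correct(s, 4, 2.4));  /* prints ~0.56, not 0.25 */
    return 0;
}
[/code]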

Again, I don’t know for sure if this is what ATI opted to do; it is what I have advocated in the past, but you can see the assumption in that last sentence. The number I used is 2.4, and that needs to be programmable. This needs to work for different displays, and it needs to be the NET of hardware gamma correction and actual display gamma, with the assumption that the hardware gamma is insufficient for the display (the motivation for doing this in the first place).

[This message has been edited by dorbie (edited 08-22-2002).]

knackered, no, you need to research why a gamma-correct sample sum is a good thing for AA (actually, understanding my preceding post would do it). The fact that you think AA sucks is BECAUSE your card’s gamma correction is set to 1, while your monitor requires gamma correction of around 2.4.

Try this: run something with AA (a white poly on a black background would be ideal because the gamma won’t affect the overall appearance of the image). Draw with AA and hardware gamma set to 1, then look at it with your gamma correction set to 2.5 in your advanced display settings. Then tell us what you think of AA. Now, wouldn’t it be nice to get that effect on AA in all your game images without screwing up the overall image brightness in the mid-tones? Exactly!
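
If you want to reproduce that test quickly, here is a minimal GLUT sketch of a white polygon on a black background with multisample AA (assuming your GLUT build and visual support GLUT_MULTISAMPLE); run it, then flip your hardware gamma between 1.0 and 2.5 and watch the edges.

[code]
/* Minimal GLUT sketch of the test above: a white polygon on a black
 * background with multisample AA, so the hardware gamma setting mainly
 * changes how the edge samples look.  Assumes a GLUT build and visual
 * that support GLUT_MULTISAMPLE. */
#include <GL/glut.h>

#ifndef GL_MULTISAMPLE_ARB
#define GL_MULTISAMPLE_ARB 0x809D
#endif

static void display(void)
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);   /* black background  */
    glClear(GL_COLOR_BUFFER_BIT);

    glEnable(GL_MULTISAMPLE_ARB);           /* multisample AA on */
    glColor3f(1.0f, 1.0f, 1.0f);            /* white polygon     */
    glBegin(GL_TRIANGLES);
        glVertex2f(-0.8f, -0.7f);
        glVertex2f( 0.8f, -0.6f);           /* near-horizontal edge shows the steps */
        glVertex2f( 0.1f,  0.8f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_MULTISAMPLE);
    glutInitWindowSize(512, 512);
    glutCreateWindow("white poly, compare hardware gamma 1.0 vs 2.5");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/code]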

I don’t need to debate whether AA is a good thing. The rest of the graphics industry saw years ago that AA is essential to image quality. Feel free to catch up whenever.

The performance hit for AA is diminishing with each generation of cards designed for it. The point is, if you are going to take that hit, you should get the appropriate return on investment; what is asinine is turning on AA, with all that fancy Quincunx etc., and ending up with the wrong sample weightings even after you take the hit.

[This message has been edited by dorbie (edited 08-22-2002).]

Dorbie, great explanation. I simply could not see it before, but now it’s obvious. Thanks.

Originally posted by dorbie:
Humus, that 9700 AA shot you posted is 6x not 4x.

Yeah, my mistake. Fixed it now …

No dorbie, you’ve missed my point. I’m not complaining about the quality of current AA techniques, I’m questioning whether or not it is necessary to use AA at all, in any form.
You’re a graphics programmer - it’s your business to look for ways to improve images; you look more closely than consumers do at images. A gamer (which is who ‘consumer’ cards are aimed at) does not look this closely at polygon boundaries, and tends not to notice jaggies any more when operating at a reasonable resolution - go ahead, ask a gamer his opinion on aliasing artefacts, and the silence will deafen you.
Feel free to catch up? You’re very cheap Dorbie, and not a little anal.
Just wanted to clarify what I was actually saying - feel free to carry on having a conversation with yourself, Dorbie.

Originally posted by davepermen:
[b]well, vman, you’re just plain wrong… edges still get jaggy, and independent of resolution, the jagginess is easily visible…

why does a dvd on the tv, like shrek, look far more detailed than any game at 800x600? it’s not the resolution… it’s just that there is more detail per pixel…

and they don’t get the pixels from the left, the right, etc… they DO render the screen at a higher res, and downsample it (directly, in hw; on my gf2mx it’s done in software => ultra-high-res rendering, then sampled down with a box filter)…

so it’s the same…

the amount of pixels on the screen is enough.[/b]

True, they are not getting the final color from the left and right. I’m not saying that downsampling (these methods are called high-resolution AA) doesn’t have its merits, but the problem is that it tends to give a dull look in cases where brightly coloured pixels are surrounded by less bright ones.

Doing edge AA will reduce that.

My final point about increasing monitor and video resolution was about forgetting AA methods completely. If you increase the resolution high enough, your eyes (brain) will do the merging of pixels and, as a bonus, you won’t be seeing jagginess. It would be like looking at a photograph.

I don’t know about you guys, but I can see the individual pixels (the phosphors) on my monitor, and the spaces between them.
Oh yes, Shrek. It’s hard to see the details, since TV images (old ones anyway) are jumpy and the vertical resolution here is 525 lines, I hear. Try watching it on a computer. Big difference!

V-man

Feel free to catch up? You’re very cheap Dorbie, and not a little anal.

Anal or not, Dorbie’s right. Antialiasing is essential to photorealistic results. Sure, it may not matter for current games, but things like OpenGL 2.0 and so forth aren’t about current games. They’re about the future. And the future is photorealistic graphics.

There’s a reason that ATi put the effort into anisotropic filtering (which, btw, is a form of antialiasing) and other image-quality-enhancing features: image quality is the future. Right now, you can run modern games on a 9700 at 1024x768 with good antialiasing and high-quality visuals at acceptable framerates. No other card can do that.

In terms of gaming, this is nothing less than a revolution. Most gamers can’t play a game at much more than 1280x1024 anyway (due to monitor limitations), so dropping down to 1024x768 to get a better image is hardly an unreasonable idea. And, regardless of whether they claim to notice aliasing or not, they subconsciously notice it. You’ll never get photorealistic rendering (or anywhere close to it) without antialiasing.

AA does make a lot of sense. With rotated-grid multisampling you can achieve drastically better image quality at a lower performance cost than by upping the resolution.
And that’s the real kicker. 4x AA on a Gf4 won’t do you much good; it’s just a bad tradeoff (because it’s ordered grid), but RGMSAA is IMO the best thing since sliced bread.

Anyone with a Gf4Ti can do a little experiment:

1) Fire up a game, any game, but set it to 800x600. Look around closely.
2) Activate 2x AA, then fire up the game again, this time set to 640x480.

Now, which setting is better? You already know my opinion

knackered

Screw gamers. The hardcore gamer is a halfwit who probably runs without lighting because he thinks 180fps is better than 120fps. Does that mean we don’t need hardware lighting either?

With graphics, heaven is in the details, and it’s the subtle stuff that makes all the difference. The stuff you don’t notice until it’s taken away.

Those of us in film, TV and science have been waiting for this stuff for way too long, while the game guys get more polys than they can even use.

Good on you, ATI; genlock next, please. And knackered, drag yourself away from your computer for a couple of hours and read a book. You might learn something.

Originally posted by knackered:
A gamer (which is who ‘consumer’ cards are aimed at) does not look this closely at polygon boundaries, and tends not to notice jaggies any more when operating at a reasonable resolution - go ahead, ask a gamer his opinion on aliasing artefacts, and the silence will deafen you.

Well, go visit a gamers’ forum like the ones at Rage3D or NvNews and you’ll find that AA is a feature that is talked about quite often. Gamers really do request good AA quality these days. Generally speaking, gamers talk about speed, AA and anisotropic filtering. The rest of a card’s attributes get batched together under vague labels like “DX8 card” or “DX9 card”.

henryj,

Those of us in film, TV and science have been waiting for this stuff for way too long, while the game guys get more polys than they can even use.

How does a hardware accelerator help in film/TV production? All of this stuff is rendered offline, as far as I know, giving artists the freedom to choose the methods they need/want.

How does a hardware accelerator help in film/TV production? All of this stuff is rendered offline, as far as I know, giving artists the freedom to choose the methods they need/want.

Yes, but now consumer-level cards are getting close to giving them what they need. With good antialiasing, good shader support, and a few other things, they can start speeding up production through hardware acceleration.

knackered, I’m not being cheap. I said AA needs gamma correction; initially you misstated what I had said about what you need to research, but I did understand your claim. It was a bit of a non sequitur for you to say the emperor has no clothes on AA, but your entire tone required a suitable response. You imply that everyone who appreciates the benefits of AA is a fool running around pretending it’s a good thing instead of looking for themselves, when the truth is that many industries rely on antialiasing for image quality and the difference is completely obvious, even to the casual observer. I can SEE aliasing; don’t tell me it’s not there. It is easy to think of cases where AA is vital even in games: think of the lines on a runway in a flight sim, for example, or a distant target like a tank. Aliasing is an obvious distraction in these applications.

When card reviews online post comparisons of 2-, 4-, 6- and 16-sample AA, I have some evidence that AA is seen as an important issue and that quality is carefully tracked. When davepermen posts “geforce aa looks awful”, bearing in mind that even that is better than no AA, I can be confident that the benefits of good-quality AA vs. poorer AA are clear to others. When graphics cards are implementing AA with less and less of a performance penalty with each generation, I think it is inevitable that this feature will become a mainstream one (if it hasn’t already).

Have you cranked up your gamma correction and looked at AA quality yet? Go to Display Properties -> Settings -> Advanced -> Color Correction, then evaluate AA. I think you may change your mind.

[This message has been edited by dorbie (edited 08-23-2002).]