I decided to modify my shader so that it writes to gl_FragDepth, and I was unpleasantly surprised to find that my scene now looks as if I had disabled anti-aliasing.
I couldn't think of any cause that would produce such an artifact. I tried many shaders and always got the same strange result: as soon as I write to gl_FragDepth, the scene renders exactly as if AA were disabled.
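For reference, even a trivial write like the following is enough to trigger the behavior being described (this is a minimal sketch, not the poster's actual shader; the small offset is purely illustrative):

```glsl
// Minimal pre-GLSL-1.30-style fragment shader. Any static write to
// gl_FragDepth replaces the per-sample depth the rasterizer would
// otherwise produce, which is the behavior under discussion.
void main()
{
    gl_FragColor = gl_Color;
    // Illustrative depth write: pass through the fragment depth
    // with a tiny offset added.
    gl_FragDepth = gl_FragCoord.z + 0.001;
}
```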
I’m using a GeForce FX 6800 Ultra.
Has anybody seen something like that?
Well, I think that is to be expected, at least to some degree. If you are writing a depth value for the fragment, you are replacing the multiple depth values that would normally be used in the multisample depth test. I would, however, expect that you could still get antialiasing from the coverage mask for the pixels.
Shouldn’t the fragment program be executed once for each sample when multisampling is activated?
A simple test would be to write the depth value through unmodified, or perhaps manipulate it slightly, e.g. by multiplying it by a uniform that you set to 1 so the driver's optimizer can't fold the write away.
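The suggested test could look something like this (a sketch; `depthScale` is a made-up uniform name, set to 1.0 by the application so the result is mathematically a pass-through but the driver cannot optimize the gl_FragDepth write out):

```glsl
// Hypothetical test shader: write the incoming depth back out, scaled
// by a uniform. With depthScale == 1.0 the depth is unchanged, but the
// dependency on a uniform keeps the compiler from eliminating the write.
uniform float depthScale; // application sets this to 1.0

void main()
{
    gl_FragColor = gl_Color;
    gl_FragDepth = gl_FragCoord.z * depthScale;
}
```

If AA still breaks with this shader, the problem is the mere presence of a gl_FragDepth write, not the particular value being written.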
I guess the trouble comes from the hardware duplicating the same depth value across all the samples.
If it were simply a matter of having the same depth value for all samples, don’t you think disabling the depth test would hide the problem?
Possibly yes and possibly no. It would still have to write the depth values. I say “possibly” because I have no insider information, so the best thing to do is try it.