I am working on Android, where I am restricted to OpenGL ES and seldom have multisample buffers available.
The problem is antialiasing polygon edges… surely a very common issue. My initial solution was transparent texture edges (i.e. letting bilinear filtering provide smooth edges), but this requires drawing the scene in back-to-front order. Since my scenes tend to have a lot of overdraw, this totally killed my framerate.
I am now wondering whether antialiasing can be achieved in a fragment shader. For that to work, the shader would need a “percentage of fragment covered by the polygon” input, and blend according to that.
Is such a thing possible? Can fragment shaders get this kind of information? I have not yet written any shaders… I’ve only done fixed-function GL work up to now.
Thanks for any advice!
The problem is antialiasing polygon edges… surely a very common issue.
Typically this issue is dealt with by simply accepting the aliasing. If you don’t have access to multisampling, any other solution is probably not going to be worth the performance cost of implementing it.
If the GPU doesn’t have multisampling, you can simulate it manually: draw the scene 4 times into textures, each time offsetting the projection matrix (not the viewport, which only moves in whole pixels) by a different sub-pixel amount. Then average the 4 resulting textures.
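As a sketch of the combine step, a fullscreen quad could average the four jittered renders with a fragment shader like this (GLSL ES 2.0; the `u_pass0..u_pass3` and `v_texCoord` names are my own, not from any particular codebase):

```glsl
precision mediump float;
varying vec2 v_texCoord;       // passed through from the fullscreen quad's vertex shader
uniform sampler2D u_pass0;     // the 4 renders, each drawn with the projection
uniform sampler2D u_pass1;     // matrix translated by a different sub-pixel
uniform sampler2D u_pass2;     // offset (e.g. +/-0.25 px, which in NDC is
uniform sampler2D u_pass3;     // +/-0.5 / viewportSize)

void main() {
    // Simple box filter: average the four jittered samples per pixel.
    gl_FragColor = 0.25 * (texture2D(u_pass0, v_texCoord)
                         + texture2D(u_pass1, v_texCoord)
                         + texture2D(u_pass2, v_texCoord)
                         + texture2D(u_pass3, v_texCoord));
}
```

Note this costs roughly 4x the fill rate plus four framebuffer switches, so it mainly makes sense for static scenes or screenshots.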
Another way is to draw the edges as 3 px wide lines with biased depth, and use a fragment shader on them to set alpha = gamma_correct(1 - distance(gl_FragCoord.xy, lineCenterVarying));
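A minimal sketch of that shader (GLSL ES 2.0), assuming the vertex shader writes the line’s window-space position into `lineCenterVarying` so it interpolates along the line’s centerline, and using a fixed 2.2 gamma as a stand-in for `gamma_correct()`:

```glsl
precision mediump float;
varying vec2 lineCenterVarying;  // window-space centerline position (assumed setup)
uniform vec4 u_color;            // assumed: the edge color

void main() {
    // 0.0 at the line's center, 1.0 at the edge of the 3 px line (1.5 px out)
    float d = distance(gl_FragCoord.xy, lineCenterVarying) / 1.5;
    float alpha = clamp(1.0 - d, 0.0, 1.0);
    // crude gamma correction so the falloff looks perceptually linear
    alpha = pow(alpha, 1.0 / 2.2);
    gl_FragColor = vec4(u_color.rgb, u_color.a * alpha);
}
```

You’d draw these lines with blending enabled (e.g. `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)`) after the filled geometry.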
But it only rarely looks nice, and it doesn’t handle intersecting geometry.
Only the latest GPUs expose coverage information to fragment shaders.
Maybe you can adapt the GPAA method to GLES. It requires no multisampling, but you have to render the geometry twice to find the polygon edges.
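Roughly, GPAA’s second pass draws each silhouette edge as a line; the fragment shader computes the pixel center’s signed distance to the true edge and blends the pixel with its neighbor across the edge, weighted by the estimated coverage. A very loose sketch of that idea (GLSL ES; the `v_edgeEq` varying and uniform names are my assumptions, not GPAA’s actual code):

```glsl
precision mediump float;
uniform sampler2D u_backBuffer;  // copy of the already-rendered scene
uniform vec2 u_pixelSize;        // 1.0 / viewport size in pixels
varying vec3 v_edgeEq;           // edge line equation in window space:
                                 // dot(v_edgeEq, vec3(x, y, 1)) = signed distance in px

void main() {
    float d = dot(v_edgeEq, vec3(gl_FragCoord.xy, 1.0));
    // Fraction of this pixel lying on the far side of the edge.
    float weight = clamp(0.5 - abs(d), 0.0, 1.0);
    // Step one pixel across the edge, along the normal's major axis.
    vec2 n = v_edgeEq.xy;
    vec2 offsetDir = (abs(n.x) > abs(n.y)) ? vec2(sign(n.x), 0.0)
                                           : vec2(0.0, sign(n.y));
    vec2 uvSelf     = gl_FragCoord.xy * u_pixelSize;
    vec2 uvNeighbor = (gl_FragCoord.xy - sign(d) * offsetDir) * u_pixelSize;
    gl_FragColor = mix(texture2D(u_backBuffer, uvSelf),
                       texture2D(u_backBuffer, uvNeighbor), weight);
}
```

The upside over plain supersampling is that you only pay for pixels actually on edges; the downside is the extra geometry pass and needing the scene in a readable texture.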