(Sorry for the shameless plug)
We are going to release an open-source tool, called ShadeVis, for computing the ambient occlusion term of a generic mesh.
You can see some snapshots in the image gallery section of our site.
What are the improvements compared to ATI’s NormalMapper UI?
Doesn’t the ATI NormalMapper generate a low-res mesh and normal maps from a high-res mesh?
It only generates a normal map. For that it needs both a low-res and a high-res model: the low-res as the target geometry to map onto, and the high-res as the source of geometry and normals for the normal map.
I believe what Cignoni is doing is something completely different.
The ATI NormalMapper tool takes as input a high-resolution mesh and a low-res mesh with a good UV parametrization, and builds a normal map for the low-res mesh that captures the detail of the high-res mesh.
Usually you have to build the low-res mesh, with its parametrization, by hand.
The ShadeVis tool aims to compute a per-vertex ambient occlusion term that can be reused during interactive rendering. There is a nice primer on this technique on Gamasutra: http://www.gamasutra.com/features/20040319/hill_01.shtml .
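For readers who just want the idea behind that primer: the baked per-vertex term is essentially an accessibility value, the (typically cosine-weighted) fraction of directions from which the vertex can see the sky. A minimal Python sketch of that definition — the `is_visible` callback and all names here are illustrative, not ShadeVis’s actual API:

```python
def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def accessibility(normal, sample_dirs, is_visible):
    """Cosine-weighted fraction of sample directions from which the
    vertex is unoccluded; 1.0 = fully open, 0.0 = fully blocked."""
    total = weight = 0.0
    for d in sample_dirs:
        c = max(0.0, dot(normal, d))   # directions below the surface get zero weight
        weight += c
        if c > 0.0 and is_visible(d):  # visibility query against the scene
            total += c
    return total / weight if weight > 0.0 else 0.0

# At render time the baked value simply scales the ambient contribution,
# e.g. color = albedo * (ambient * accessibility + direct_lighting).
```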
I have tried to describe the ShadeVis tool in more detail at http://vcg.sourceforge.net/tiki-index.php?page=ShadeVis
Thanks for the comments.
ATI’s NormalMapper has been able to compute a per-texel ambient occlusion term since version 3.01.00.
And spasi got the point …
So… what are the improvements compared to ATI’s NormalMapper?
OK, I probably should have been clearer.
- We compute ambient occlusion per vertex, not per texel, so you do not need a parametrization of the surface. It is quite hard to parametrize very large meshes well (that was our starting objective).
- We aim to handle large meshes: we have used the tool to compute the ambient occlusion of an 8M-triangle mesh.
- We are reasonably fast: ~40 seconds to compute the ambient occlusion of a 1M-triangle mesh.
NVIDIA’s Melody can perform this parameterization, and it can automatically generate low-res versions of your input (high-res) model.
Still, this is a long and annoying process, but it has the major advantage of supporting per-pixel ambient occlusion when used in conjunction with ATI’s NormalMapper UI. Having this ambient term stored in a texture has another advantage: you can very easily decide where to apply it.
I can see the point of per-vertex occlusion on gigantic meshes, but IMHO it presents little interest for 5k to 50k meshes that heavily use per-pixel lighting.
Yet, the pictures are very nice.
I obviously know that there are tools for automatically creating a parameterization for a generic mesh, and that there are of course tools for simplifying meshes. Joking apart, I never meant to say that per-vertex ambient occlusion is the ideal solution for a low-poly textured mesh.
But for rather large meshes (i.e. on the order of mega-triangles, as opposed to the gigantic ones approaching giga-triangles), where you have a lot of unstructured, messy geometry, texturing all these triangles may not be the optimal solution. So storing ambient occlusion on vertices is a cheap and easy solution, and it seemed to me that there was no such tool around. This was our objective, and I thought it was clearly stated on the web page.
The approach used is quite simple (trivial, I should say): for each sampled direction, just ortho-render the scene, grab the z-buffer, and check whether each vertex is occluded from that direction.
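To make that description concrete, here is a toy, self-contained Python sketch of the same idea — not ShadeVis’s actual code: orthographically project the mesh along a direction into a small software z-buffer (triangles rasterized with edge functions), call a vertex visible from that direction if its projected depth matches the buffer within a small bias, and average over directions for the per-vertex term. The resolution, bias, and all names are illustrative choices.

```python
import math

RES  = 64     # software depth-buffer resolution (illustrative)
BIAS = 1e-3   # depth tolerance against self-occlusion

def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def edge(a, b, p):  # 2D signed-area test used for rasterization
    return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0])

def visible_from(verts, tris, d):
    """For each vertex, is it unoccluded when viewed from direction d?"""
    l = math.sqrt(dot(d, d)); d = (d[0]/l, d[1]/l, d[2]/l)
    ax = (1.0, 0.0, 0.0) if abs(d[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = cross(d, ax); l = math.sqrt(dot(u, u)); u = (u[0]/l, u[1]/l, u[2]/l)
    v = cross(d, u)
    # Ortho projection: (x, y) on the image plane; the camera sits far
    # along +d looking back, so depth = -dot(p, d) (smaller = closer).
    proj = [(dot(p, u), dot(p, v), -dot(p, d)) for p in verts]
    x0 = min(p[0] for p in proj); y0 = min(p[1] for p in proj)
    s = max(max(p[0] for p in proj) - x0, max(p[1] for p in proj) - y0) or 1.0
    pp = [((x-x0)/s*(RES-1), (y-y0)/s*(RES-1), z) for (x, y, z) in proj]
    zbuf = [[float('inf')]*RES for _ in range(RES)]
    for (i, j, k) in tris:                      # rasterize every triangle
        a, b, c = pp[i], pp[j], pp[k]
        w = edge(a, b, c)
        if w == 0:
            continue                            # edge-on in this view
        for iy in range(max(0, int(min(a[1], b[1], c[1]))),
                        min(RES - 1, int(max(a[1], b[1], c[1])) + 1) + 1):
            for ix in range(max(0, int(min(a[0], b[0], c[0]))),
                            min(RES - 1, int(max(a[0], b[0], c[0])) + 1) + 1):
                p = (ix, iy)
                l0, l1, l2 = edge(b,c,p)/w, edge(c,a,p)/w, edge(a,b,p)/w
                if l0 >= 0 and l1 >= 0 and l2 >= 0:   # pixel inside triangle
                    z = l0*a[2] + l1*b[2] + l2*c[2]
                    zbuf[iy][ix] = min(zbuf[iy][ix], z)
    return [z <= zbuf[min(RES-1, int(y))][min(RES-1, int(x))] + BIAS
            for (x, y, z) in pp]

def ambient_occlusion(verts, tris, dirs):
    """Fraction of sampled directions from which each vertex is visible."""
    acc = [0.0]*len(verts)
    for d in dirs:
        for i, vis in enumerate(visible_from(verts, tris, d)):
            acc[i] += vis
    return [a/len(dirs) for a in acc]
```

For example, with two stacked triangles and a single direction pointing up, the lower triangle’s vertices come out fully occluded and the upper one’s fully visible. A real implementation would of course render on the GPU and read the z-buffer back, which is where the speed comes from.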
A final note: we wrote the first version of this stuff four years ago (we published something on it in 2001), slightly before ambient occlusion reached the general public, if I remember correctly, with a SIGGRAPH RenderMan course in 2002. In any case, I think the technique was originally proposed in:
S. Zhukov, A. Iones, G. Kronin, "An Ambient Light Illumination Model", Proc. Eurographics Rendering Workshop '98.