I have to render hair with a polygon hair model using textures… for normal alpha blending the polygons must be sorted, but I can't do that in my case; I can't sort every strand. Any ideas?
There's an order-independent transparency technique known as depth peeling. You can download a paper in PDF/PPT here
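Roughly, depth peeling renders the scene once per transparency layer: each pass keeps only the nearest fragment strictly behind the previous pass's depth, so the layers come out front to back without ever sorting the geometry. A toy per-pixel sketch of that logic (Python, my own illustration, not the paper's code; on the GPU the "peel" is a second depth test against the previous pass's depth buffer):

```python
def peel_layers(fragments):
    """fragments: list of (depth, color, alpha) in arbitrary order.
    Each iteration 'peels' the nearest fragment strictly behind the
    previous peel depth, yielding the fragments front to back."""
    layers = []
    last_depth = float("-inf")
    for _ in range(len(fragments)):
        # second depth test: only fragments behind the last peeled layer
        candidates = [f for f in fragments if f[0] > last_depth]
        if not candidates:
            break
        nearest = min(candidates, key=lambda f: f[0])  # normal depth test
        layers.append(nearest)
        last_depth = nearest[0]
    return layers

def composite_front_to_back(layers, background):
    """'Under' blending of the peeled layers over a background color."""
    color, acc_alpha = 0.0, 0.0
    for _depth, c, a in layers:
        color += (1.0 - acc_alpha) * a * c
        acc_alpha += (1.0 - acc_alpha) * a
    return color + (1.0 - acc_alpha) * background
```

Feeding in fragments in arbitrary order gives exactly the same result as sorting them back to front and using ordinary over-blending, which is the whole point. The cost is one geometry pass per layer, so it gets expensive for deep hair.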
In the Dawn and Dusk demos we used the “SAMPLE_ALPHA_TO_COVERAGE” mode with multisampling to provide order-independent transparency at the hair tips. This was using line primitives; I’m not sure how it would look with polygons.
Originally posted by simongreen:
In the Dawn and Dusk demos we used the “SAMPLE_ALPHA_TO_COVERAGE” mode with multisampling to provide order-independent transparency at the hair tips.
This sounds interesting; I’ve never heard of this technique. Could you give us some insight into how it works?
Imagine screen-door transparency applied at the subsample level and you’ll have a pretty good idea. SGI Infinite Reality supported a similar thing. Displayable levels of transparency are limited to the number of subsamples per pixel.
In addition, multiple surfaces of the same transparency usually do not interfere with each other (i.e. there is no difference between rendering one or arbitrarily many transparent surfaces of the same transparency), because they affect the same samples. So it works well if you expect only one transparent surface per pixel, but it is not a general replacement for sorting. For that case, though, it’s really nice and fast.
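A toy model of that subsample screen-door idea (my own sketch with a fixed, shared coverage mask; real hardware typically dithers the pattern, and the resolved value also depends on depth, which is omitted here):

```python
N_SAMPLES = 4  # subsamples per pixel; gives N_SAMPLES + 1 transparency levels

def alpha_to_coverage(alpha):
    """Map alpha to a coverage mask: round(alpha * N) of the N bits set.
    Every surface with the same alpha gets the same mask here."""
    covered = round(alpha * N_SAMPLES)
    return (1 << covered) - 1  # lowest 'covered' bits set

def resolve(surfaces, background):
    """Each covered subsample takes the surface color (no blending);
    the multisample resolve then averages the subsamples."""
    samples = [background] * N_SAMPLES
    for mask, color in surfaces:
        for i in range(N_SAMPLES):
            if mask & (1 << i):
                samples[i] = color
    return sum(samples) / N_SAMPLES
```

This shows both points above: alpha 0.5 covers exactly half the samples (only N+1 distinct transparency levels exist), and drawing a second surface with the same alpha overwrites the same samples, so the resolved pixel doesn’t change — no double-darkening, but also no true layered blending.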
If the hair is built out of several “shells” (layers), you can render back to front quickly: enable front-face culling and render the shells from outer to inner, then enable back-face culling and render from inner to outer. That way you’ve drawn the back-most parts first and the front-most parts last.
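To see why those two culled passes come out back to front, model each shell as a sphere of radius r around the head, with the viewer on the +z side: the culled-away half of shell i sits near z = -r, the visible front half near z = +r. A small sketch of the resulting draw sequence (my own illustration):

```python
def shell_draw_order(radii):
    """radii sorted inner to outer. Returns the (z, half) draw sequence:
    pass 1 (front-face culling) draws the back halves, outer to inner;
    pass 2 (back-face culling) draws the front halves, inner to outer."""
    seq = []
    for r in reversed(radii):   # pass 1: back halves, outermost first
        seq.append((-r, "back"))
    for r in radii:             # pass 2: front halves, innermost first
        seq.append((+r, "front"))
    return seq
```

For shells of radii 1, 2, 3 the z sequence is -3, -2, -1, 1, 2, 3: strictly increasing toward the viewer, i.e. perfect back-to-front order with no per-polygon sorting. Of course this only holds while the shells stay roughly concentric and convex from the camera’s point of view.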
Depends a lot on what kind of hair…
The way of faking order-independent transparency with supersampling and alpha testing is a funny idea :D