cloud shader

This isn't strictly an OpenGL question, but I thought I'd ask, since this board has been kind about lazy posting so far.

Anyhow, I'm pretty sure I remember reading about a demo that did a cloud shader by encoding the approximate density of the cloud, as seen from various vantage points, on a per-vertex basis.

So basically you would send in extra vertex attributes, which would act as a LUT of the density at that vertex... that is, how thick the cloud is when seen through from each of several sampled angles.

Then you interpolate between the sampled angle densities, based on the actual viewing angle, to get the final blending factor for that vertex.
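For concreteness, here's a minimal sketch of the interpolation I'm picturing, assuming the densities were baked for six axis-aligned directions (a sort of cube of thickness values); the six-direction layout and all the names here are my own assumptions, not anything from the demo I half-remember:

/* Hypothetical per-vertex LUT: cloud thickness as seen along
   +X, -X, +Y, -Y, +Z and -Z (baked offline, values in [0,1]). */
typedef struct {
    float density[6];
} VertexDensityLUT;

/* Weight each baked sample by how well its direction matches the
   actual (normalized) view direction, then renormalize the weights. */
static float interpolated_density(const VertexDensityLUT *lut,
                                  const float view[3])
{
    float sum = 0.0f, wsum = 0.0f;
    for (int axis = 0; axis < 3; ++axis) {
        float c = view[axis];
        float wpos = (c > 0.0f) ?  c : 0.0f;  /* weight for the +axis sample */
        float wneg = (c < 0.0f) ? -c : 0.0f;  /* weight for the -axis sample */
        sum  += wpos * lut->density[2 * axis + 0]
              + wneg * lut->density[2 * axis + 1];
        wsum += wpos + wneg;
    }
    return (wsum > 0.0f) ? (sum / wsum) : 0.0f;
}

In practice this would live in the vertex program, with the six densities arriving as a couple of generic attributes, but the blend itself is the same.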

So I guess I'm curious whether anyone else has heard of, or thought of, doing this. Is it common practice? Any suggestions, creative or not, on how to go about uploading the per-vertex, view-dependent density LUTs?

Like I said, this isn't an OpenGL technical question... it's more algorithmic, but it might diverge into OpenGL at some point.

Your basic options at the top level, as I see them, are either to put the tables in a texture and upload coordinates, or to try to stuff the LUTs directly into the vertex attributes.
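For the attribute route, I'm imagining something like the following (OpenGL 2.0 generic attributes; the interleaved struct, the attribute locations, and the use of GLEW to get the entry points are all just assumptions for illustration):

#include <GL/glew.h>   /* or however you load the GL 2.0 entry points */

/* Interleaved vertex: position plus the six baked densities,
   fed to the shader as one vec4 attribute and one vec2 attribute. */
typedef struct {
    float pos[3];
    float density[6];   /* +X, -X, +Y, -Y, +Z, -Z thickness samples */
} CloudVertex;

void setup_cloud_attribs(GLuint locPos, GLuint locDensA, GLuint locDensB,
                         const CloudVertex *verts)
{
    GLsizei stride = sizeof(CloudVertex);

    glEnableVertexAttribArray(locPos);
    glVertexAttribPointer(locPos, 3, GL_FLOAT, GL_FALSE, stride, verts->pos);

    glEnableVertexAttribArray(locDensA);
    glVertexAttribPointer(locDensA, 4, GL_FLOAT, GL_FALSE, stride,
                          &verts->density[0]);

    glEnableVertexAttribArray(locDensB);
    glVertexAttribPointer(locDensB, 2, GL_FLOAT, GL_FALSE, stride,
                          &verts->density[4]);
}

The texture alternative would keep the vertex format small (just a coordinate per vertex) but would need the lookup to happen in the shader instead.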

From there it's just a question of data alignment and encoding, and maybe of whether it's worth taking into account how likely a particular vertex is to be back-facing.
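On the encoding/alignment side, the obvious thing I can think of (again untested, names made up) is to quantize the densities to unsigned bytes and let GL renormalize them, so all six samples plus two bytes of padding sit in one aligned 8-byte slot:

#include <GL/glew.h>   /* or however you load the GL 2.0 entry points */

/* Densities quantized to 8 bits; six are used, two pad bytes keep
   the struct 4-byte aligned. */
typedef struct {
    float         pos[3];
    unsigned char density[8];   /* +X, -X, +Y, -Y, +Z, -Z, pad, pad */
} PackedCloudVertex;

static unsigned char quantize_density(float d)
{
    /* clamp to [0,1] and scale to [0,255] with rounding */
    if (d < 0.0f) d = 0.0f;
    if (d > 1.0f) d = 1.0f;
    return (unsigned char)(d * 255.0f + 0.5f);
}

void setup_packed_density_attribs(GLuint locDensA, GLuint locDensB,
                                  const PackedCloudVertex *verts)
{
    GLsizei stride = sizeof(PackedCloudVertex);

    /* normalized = GL_TRUE, so the shader sees the values back in [0,1];
       the two pad bytes just ride along as unused components. */
    glEnableVertexAttribArray(locDensA);
    glVertexAttribPointer(locDensA, 4, GL_UNSIGNED_BYTE, GL_TRUE, stride,
                          &verts->density[0]);

    glEnableVertexAttribArray(locDensB);
    glVertexAttribPointer(locDensB, 4, GL_UNSIGNED_BYTE, GL_TRUE, stride,
                          &verts->density[4]);
}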

Sincerely,

Michael

PS: If this is an inappropriate topic for this board, please let me know.