Does ATI support the hardware noise function?

Interesting quote from the spec :)

Though in theory I'm thinking you should still be able to check for, or expect, this kind of feature the same way you might expect any other feature from, say, Mount Evans to be implemented in hardware (which at the end of the day is just another version of GL, though perhaps unique in the sense that it's predicated on a minimum hardware requirement). Mount Evans smells different to me in that respect. But it could be I've misinterpreted things…

Anyway, I really like the minimum hardware requirement thing… you really get a sense of where you're standing with that.

P.S. I've got to get me one of those leaping lizards :)

I'd use a precalculated texture anyway, even if performance were good, simply because of mipmapping, which solves many of the hard aliasing problems you'd get with a plain noise() function. I suppose noise() could be useful in the vertex shader, though.

While mipmapping is indeed an easy way out, it is certainly not the only way. Antialiasing of purely procedural noise is perfectly possible, and it has been done in off-line rendering for ages. All you need are good partial derivatives to compute how big a fragment is in the texture coordinate domain, so you can fade out the noise components that are too high in frequency for the current view. Noise is particularly easy here because of its band-limited nature.
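For illustration, here is a minimal fragment-shader sketch of that idea, assuming a user-supplied snoise() simplex noise function; the octave count and the fade factor are illustrative tuning choices, not taken from any particular implementation. fwidth() gives the fragment's footprint in the noise domain, and each octave is faded out as its wavelength approaches that footprint:

```glsl
float snoise(vec3 p);  // assumed user-supplied simplex noise function

// A minimal sketch of derivative-based antialiasing for procedural noise.
float aanoise(vec3 p)
{
    // Approximate size of this fragment in the noise coordinate domain.
    float w = length(fwidth(p));

    float sum  = 0.0;
    float amp  = 0.5;
    float freq = 1.0;
    for (int i = 0; i < 5; i++) {
        // Fade an octave to zero as its wavelength nears the fragment
        // footprint; the factor 2.0 is an illustrative tuning constant.
        float fade = clamp(1.0 - 2.0 * w * freq, 0.0, 1.0);
        sum  += fade * amp * snoise(p * freq);
        freq *= 2.0;
        amp  *= 0.5;
    }
    return sum;
}
```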

3D noise is intractable to store in a precomputed texture at any useful resolution (the memory cost grows with the cube of the resolution), and 4D noise is impossible. I'd say procedural noise has a pretty clear application even for fragment shaders, and it is used extensively in off-line rendering.

Nvidia has good and fast dFdx() and dFdy() functions. ATI had some problems with that before (dFdx and dFdy were unsupported). I don't know the status for current ATI hardware.

And while we have been looking the other way, procedural noise in a fragment shader is in fact not as time-consuming as one might think, even if you emulate it in a shader function. My simplex noise demo was written two years ago, and it had pretty good performance on a GeForce 6800. Hardware has certainly become much faster since, and keeps improving.
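To give a flavor of what shader-emulated noise looks like, here is a generic hash-based value noise sketch (not the simplex noise from that demo; the hash constants are arbitrary examples), computed entirely in ALU instructions with no texture lookups:

```glsl
// Illustrative hash of an integer lattice point to a pseudo-random float.
float hash(vec3 p)
{
    p  = fract(p * 0.3183099 + vec3(0.1, 0.2, 0.3));
    p *= 17.0;
    return fract(p.x * p.y * p.z * (p.x + p.y + p.z));
}

// Value noise: trilinear interpolation of hashed lattice corner values.
float vnoise(vec3 p)
{
    vec3 i = floor(p);
    vec3 f = fract(p);
    vec3 u = f * f * (3.0 - 2.0 * f);  // smooth fade curve
    return mix(mix(mix(hash(i + vec3(0,0,0)), hash(i + vec3(1,0,0)), u.x),
                   mix(hash(i + vec3(0,1,0)), hash(i + vec3(1,1,0)), u.x), u.y),
               mix(mix(hash(i + vec3(0,0,1)), hash(i + vec3(1,0,1)), u.x),
                   mix(hash(i + vec3(0,1,1)), hash(i + vec3(1,1,1)), u.x), u.y), u.z);
}
```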

It's supported in the X1000 series and up.

Humus, I am curious: what solution would you suggest for getting noise in a vertex shader to run fast, or at least faster, on ATI cards?

Obviously noise (using a noise texture) and textures in general run just fine in the fragment shader, but as I am on an X-series card I cannot use either the noise function or a texture in the vertex shader to accelerate things there…
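For context, this is the kind of vertex shader that would do the job on hardware with vertex texture fetch (GeForce 6 and up), which is exactly what the X-series lacks; noiseTex and amplitude are illustrative names, not from any particular demo:

```glsl
// Sketch: vertex displacement via vertex texture fetch. Requires hardware
// vertex texture units, which ATI X-series cards do not have.
uniform sampler2D noiseTex;   // precomputed noise texture (illustrative)
uniform float amplitude;      // displacement scale (illustrative)

void main()
{
    // An explicit LOD is required for texture lookups in a vertex shader.
    float n = texture2DLod(noiseTex, gl_MultiTexCoord0.xy, 0.0).r;
    vec4 displaced = gl_Vertex + vec4(gl_Normal * n * amplitude, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * displaced;
}
```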

I am playing with a shader set at the moment that uses noise and several textures, and I require access to some kind of noise in the Vertex Shader.

I get around 600–800 fps using noise3 in the vertex shader. But that's the best I can get on a fairly low-res gluSphere for testing! Not earth-shattering!
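For reference, the setup being measured looks roughly like this (the time uniform and the 0.1 displacement scale are illustrative):

```glsl
// Sketch of displacing vertices with the GLSL built-in noise3(), which
// runs in hardware only on implementations that actually support it.
uniform float time;  // illustrative animation parameter

void main()
{
    vec3 n = noise3(gl_Vertex.xyz + vec3(time));
    vec4 displaced = gl_Vertex + vec4(n * 0.1, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * displaced;
}
```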

When I tried to use a noise texture it actually got slower, and I found out at that point about the whole Vertex Shaders / R2VB discussion.

I can't actually see how R2VB would help here. Am I missing something?
