Hardware noise() support?

Does anyone know what cards and drivers (if any) actually support GLSL noise() in dedicated hardware, or at least evaluate it as a proper procedural function?

I have tested a few combinations I had available at work and at home, and even though noise() is compiled and executed OK on recent hardware with the latest drivers, it always returns zero, like its Cg counterpart.

The GLSL “noisy” demos I have found all seem to use a texture lookup for noise values, but I want true procedural noise, not a sampler3D cheat.

I might have tried this on too old hardware, but in that case I want to know what card to buy. I want real, procedural noise!

Stefan G

Currently, no NVIDIA or ATI hardware supports noise() in hardware. I can’t speak for the 3DLabs cards, however.

I have a 3DLabs WildCat Realizm 100, and it seems to have HW noise support.
This program:

Vertex shader:

varying vec4 pos;
void main()
{
    gl_Position = ftransform();
    pos = gl_Vertex;
}

Fragment shader:

varying vec4 pos;
void main()
{
    gl_FragColor = vec4(noise3(pos.xyz), 1.0);
}

gives me this image, at a good framerate:

I’m surprised that they don’t even drop in a fixed 16x16x16 RGBA texture to perform noise() look-ups when you use that function. That would mean they’d have to fall back to returning 0 (or the input!) for noise() if you were already using all the texture image units, but that’d probably be OK…

So who has the biggest, baddest GPU now?
It sounds like 3DLabs tends to support GLSL fully, which goes beyond SM3 support.

If it has native support for all the intrinsic functions …
Probably not all of them, but at least sin, cos, tan, and some others.

They should print some details like:

noise1 : native support
noise2 : native support
noise3 : native support
noise4 : native support
reflect : native support
normalize : native support
pow : native support

Wonderful news! Looks like I will spend some of my money with 3DLabs. True procedural hardware noise() isn’t really that difficult or resource-demanding to implement, so I’m surprised that so few manufacturers do it. I had the impression that NVIDIA would be first, since Cg had a noise() function, but I guess they got sidetracked.

It sounds like 3DLabs tends to support GLSL fully, which goes beyond SM3 support.

But their hardware supports only DirectX 9 VS 2.x and PS 3.x.

True procedural hardware noise() isn’t really that difficult or resource-demanding to implement
I don’t think there is dedicated hardware for that; it can also be implemented as a shader function.
The difficulty is that really good noise can cost a lot of instructions and maybe a texture unit. You can either implement a good noise function or a fast and crappy one.
To get a cross-platform equivalent implementation with the required quality of noise distribution, applications should tailor their own noise function to their needs. Just my 2c.
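As an illustration of the fast-and-crappy end of that trade-off, here is a minimal pure-GLSL value-noise sketch. All function names and constants here are my own invention, not built-ins, and the sin/fract hash is a well-known trick that shows visible patterning at large coordinates:

```glsl
// Hypothetical sketch: cheap value noise as a plain shader function.
float hash(vec3 p)
{
    // sin/fract hash: portable, no tables or textures, but low quality
    return fract(sin(dot(p, vec3(12.9898, 78.233, 45.164))) * 43758.5453);
}

float cheapNoise(vec3 p)
{
    vec3 i = floor(p);                 // lattice cell
    vec3 f = fract(p);                 // position inside the cell
    f = f * f * (3.0 - 2.0 * f);       // smoothstep fade curve
    // trilinear interpolation of hashed values at the 8 cell corners
    return mix(mix(mix(hash(i + vec3(0.0, 0.0, 0.0)), hash(i + vec3(1.0, 0.0, 0.0)), f.x),
                   mix(hash(i + vec3(0.0, 1.0, 0.0)), hash(i + vec3(1.0, 1.0, 0.0)), f.x), f.y),
               mix(mix(hash(i + vec3(0.0, 0.0, 1.0)), hash(i + vec3(1.0, 0.0, 1.0)), f.x),
                   mix(hash(i + vec3(0.0, 1.0, 1.0)), hash(i + vec3(1.0, 1.0, 1.0)), f.x), f.y), f.z);
}
```

That’s only a handful of instructions and no texture unit, but the distribution is nowhere near Perlin quality, which is exactly the trade-off described above.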

What’s missing in their vertex shader, texture fetches?

I’d guess so; I don’t have a 3DLabs board. The versions were taken from their tech docs for the Realizm.

As I understand it, noise chips to put on the GPU are subject to a patent, so 3DLabs have probably had to pay for its inclusion. It’s about time every vendor had this function, as texture-based noise is very annoying and slow. Implementing a decent noise algorithm that’s fast is almost impossible due to texture lookup overheads.

There are some examples of how to do this on shadertech.com. However, if you want per-vertex noise, you can pass in the perm and grad tables (I’m assuming Perlin noise here) and calculate procedural noise in the vertex shader in a sensible amount of time.
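A sketch of what that vertex-shader approach could look like, following the structure of Ken Perlin’s improved-noise reference implementation. The uniform names and table layouts here are my own choices (in particular, I assume the application pre-expands a 256-entry gradient table on the CPU so the shader avoids integer bitwise ops, which early GLSL lacked):

```glsl
// Hypothetical sketch: Perlin-style gradient noise in a vertex shader,
// with lookup tables supplied as uniform arrays instead of textures.
uniform int  perm[512];   // permutation table, duplicated so perm[A + 1] never overflows
uniform vec3 grad[256];   // per-hash gradient directions, precomputed on the CPU

float fade(float t) { return t * t * t * (t * (t * 6.0 - 15.0) + 10.0); }

float vertexNoise(vec3 p)
{
    vec3 pi = floor(p);
    vec3 pf = p - pi;                      // offset inside the lattice cell
    int X = int(mod(pi.x, 256.0));
    int Y = int(mod(pi.y, 256.0));
    int Z = int(mod(pi.z, 256.0));
    float u = fade(pf.x);
    float v = fade(pf.y);
    float w = fade(pf.z);
    // hash the eight cell corners through the permutation table
    int A = perm[X] + Y,     AA = perm[A] + Z, AB = perm[A + 1] + Z;
    int B = perm[X + 1] + Y, BA = perm[B] + Z, BB = perm[B + 1] + Z;
    // dot each corner's gradient with the offset from that corner, then blend
    return mix(mix(mix(dot(grad[perm[AA]],     pf),
                       dot(grad[perm[BA]],     pf - vec3(1.0, 0.0, 0.0)), u),
                   mix(dot(grad[perm[AB]],     pf - vec3(0.0, 1.0, 0.0)),
                       dot(grad[perm[BB]],     pf - vec3(1.0, 1.0, 0.0)), u), v),
               mix(mix(dot(grad[perm[AA + 1]], pf - vec3(0.0, 0.0, 1.0)),
                       dot(grad[perm[BA + 1]], pf - vec3(1.0, 0.0, 1.0)), u),
                   mix(dot(grad[perm[AB + 1]], pf - vec3(0.0, 1.0, 1.0)),
                       dot(grad[perm[BB + 1]], pf - vec3(1.0, 1.0, 1.0)), u), v), w);
}
```

Per vertex this is quite affordable; per fragment the instruction count (and the large uniform arrays) is exactly why people fall back to textures.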

I’m only interested in fragment-level noise, so I can’t wait until this is available across all vendors.

If you are looking for per-vertex noise, a uniform array for the perm and grad tables is the way forward.

I too am mostly interested in fragment-level noise, although vertex-level noise is also interesting for “fractal” (fBm) objects like terrain.

I assume the patent mentioned is Ken Perlin’s, but I would very much doubt that he is unwilling to give hardware vendors a good deal on patent rights. I strongly suspect he is also eagerly waiting for hardware noise to happen. It’s been several years since he proposed it in his article “hardware noise” at Siggraph.

It does not take many transistors or a lot of effort to implement fast and good looking 4D Perlin noise in hardware. Ken Perlin’s most recent algorithms are very hardware friendly indeed.

Yes, I was referring to Perlin’s hardware noise. I too can’t wait for somebody besides 3DLabs to implement this (NVIDIA, you are pretty good at bringing these things out quickly; ATI, you usually follow).

With fast noise I am expecting to see a whole new array of richly procedurally textured scenes; it would be of great benefit and a huge step forward.
