NVIDIA's GL2 drivers

I just took a look at NVIDIA’s new OpenGL 2.0 drivers for Win32.
While I’m very pleased I can finally run the 3Dlabs demos on my PC, I noticed the lack of Perlin noise in shaders (the GLSL noise() functions).

Does anyone know if it’s ever going to be implemented?
Is there a possibility the hardware won’t support it directly? (I’m actually on an NV43.)
Is there a chance there will be an “emulator” like nvEmulate, just to have it in place, maybe at reduced performance?

I believe 3Dlabs is the only vendor that currently supports the noise function.

Does anyone know if it’s ever going to be implemented?
I doubt it. If it were up to me, supporting noise() and other such functions would be the absolute last thing on my list of things to do for glslang functionality.

Is there a possibility the hardware won’t support it directly? (I’m actually on an NV43.)
I, for one, would certainly not bother putting that in my hardware.

Originally posted by Korval:
I doubt it. If it were up to me, supporting noise() and other such functions would be the absolute last thing on my list of things to do for glslang functionality.
Well, you’re right for now, but considering how fast processing power is growing relative to memory bandwidth, I fear we’ll have to switch to procedural texturing pretty soon to avoid being bandwidth-limited.
However, pixel derivatives are another operation that isn’t really well supported, so proper procedural texturing is not possible anyway.
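For example, even basic filtering of a procedural pattern needs them. A minimal sketch of what I mean (the `texCoord` and `frequency` names are just placeholders, not code from my project):

```glsl
// Fragment shader: band-limited procedural stripes via derivatives.
// fwidth(t) estimates how much t changes across one pixel, which
// lets us replace a hard step() with a pixel-wide smoothstep().
varying vec2 texCoord;    // hypothetical interpolated coordinate
uniform float frequency;  // hypothetical stripe frequency

void main()
{
    float t = texCoord.x * frequency;
    float w = fwidth(t);                              // per-pixel filter width
    float f = smoothstep(0.5 - w, 0.5 + w, fract(t)); // filtered stripe edge
    gl_FragColor = vec4(vec3(f), 1.0);
}
```

Without fwidth()/dFdx()/dFdy(), the pattern has to use a hard step() and it shimmers badly under minification.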

Originally posted by Korval:
I, for one, would certainly not bother putting that in my hardware.
OK, I’ll check again… next year, I think. It would take less than 400 bytes of cache, I guess, since noise is basically a LUT, plus some extra transistors in the decode unit.
For 16-bit noise, that would rise to 128 KB. It doesn’t seem to be a very expensive thing.
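(Spelling out my arithmetic: Perlin’s reference implementation keeps a 256-entry permutation table of values 0–255, conceptually one byte each, i.e. about 256 bytes, comfortably under 400; a full 16-bit LUT would be 2^16 entries × 2 bytes = 131072 bytes = 128 KB.)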

Anyway, thank you both for the replies!

You can implement your own. Just search this forum for “simplex noise” and pick what you need.
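For instance, here’s a minimal texture-free sketch in that spirit. It’s plain value noise rather than true simplex noise, and the “sin hash” constants are the usual folklore magic numbers, so treat it as a starting point only:

```glsl
// A simple texture-free 2D value noise: smaller and simpler than
// full simplex noise, but built from the same ingredients
// (a hash, a lattice, and smooth interpolation).

// Pseudo-random hash of a lattice point. The constants are the
// well-known "sin hash" magic numbers; any decent hash works.
float hash2(vec2 p)
{
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

float valueNoise(vec2 p)
{
    vec2 i = floor(p);                // lattice cell
    vec2 f = fract(p);                // position inside the cell
    vec2 u = f * f * (3.0 - 2.0 * f); // smoothstep fade curve

    // Bilinearly interpolate the four corner hashes.
    float a = hash2(i);
    float b = hash2(i + vec2(1.0, 0.0));
    float c = hash2(i + vec2(0.0, 1.0));
    float d = hash2(i + vec2(1.0, 1.0));
    return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
}
```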

Sure, I’ve already implemented it on the CPU, so I have only one doubt.
The problem lies in encoding the permutation table in a texture or in a uniform array. I guess most people do this with a texture; however, since the accesses are random, this method will kill the texture cache.
I don’t think the same applies to a uniform array, since I guess all the uniforms are loaded into fast registers, so that could be a win.
Do you think I should care about that?
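For reference, the uniform-array variant I have in mind would look something like the sketch below, assuming the compiler even allows dynamically indexing a uniform array from a fragment shader, which I’m not sure hardware of this generation does:

```glsl
// Hypothetical permutation lookup from a uniform array instead of
// a texture. The application fills perm[] once with a random
// permutation of 0..255, duplicated so that perm[i] + offset
// never needs a second wrap.
uniform float perm[512];

float permute(float x) // x assumed integral, in [0, 511]
{
    return perm[int(x)];
}
```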

The point is that the performance of an “overridden” noise function could be very different from a hardware-accelerated one.
I understand that for development purposes this is the only way, so I guess I’ll do that, but I want it to be as similar as possible (performance-wise) to a hardware-accelerated noise.

Well, you’re right for now, but considering how fast processing power is growing relative to memory bandwidth, I fear we’ll have to switch to procedural texturing pretty soon to avoid being bandwidth-limited.
I can find stuff to do between texture lookups, like high-quality lighting and so forth.

Besides, procedural textures don’t look good for everything. Indeed, they don’t look terribly good for most things. It’s only really a select few kinds of objects (marble, wood grain, etc.) where procedural textures provide decent visual output.

It’s only really a select few kinds of objects (marble, wood grain, etc.) where procedural textures provide decent visual output
Ahem (instancing) cough :wink:
Seriously though, I’m in agreement: noise is about the last thing I want to see the hardware do. It’s a gimmick.

Noise may be a gimmick, and there seems to be no incentive to make it more than that.

GLSL (and Cg?) defines noise by its characteristics, not by an algorithm. Which noise du jour should GPU vendors build into their compilers, let alone their hardware? Does it matter if NVIDIA doesn’t match ATI? Maybe not for a game, where framerate trumps repeatability, assuming sufficient quality. Maybe yes for a production pipeline where the initial work is hardware-accelerated but the final rendering is done in software (possibly with non-real-time shader programs, as GPUs improve). There it may matter that the exact noise algorithm is known, so results track through the production flow.

A framework that allowed for a variety of noise functions (algorithms, derivatives, precisions) might let compilers substitute tuned code as market response dictates. (Isn’t something like this done for compressed textures?) As a gimmick, none of this matters. It’s just noise. :slight_smile:
