GPU-Executed Perlin/Simplex Noise in Vertex Shader

Anyone know of a guaranteed GPU-executed classic Perlin or simplex noise technique that can be used in the vertex shader? All the methods I've seen so far involve either texture lookups in the VS (which, as I understand it, is likely to force a software fallback on non-NVIDIA hardware) or large arrays of pre-generated numbers (which I imagine is inefficient, since GLSL doesn't support constant array types).

Any thoughts, anyone?


You could try something like Chapter 26 of GPU Gems 2 for the basic shader, then replace the texture samplers with something vertex-friendly (tex2Dlod?).

I guess you've looked at Stefan Gustavson's simplex/Perlin implementation already?!

Hm, I think the OP is after an implementation that doesn't use textures at all in the VS (not just one that works in the VS). The optimizations mentioned in Chapter 26 might be worth looking at in any case, though.

I've never seen or heard of a simplex/Perlin implementation that isn't based on textures - they seem unavoidable for the pseudo-random numbers at the very least.

If you don't consider vertex texture fetch (VTF) an option at all, then you might have to make do with precomputed noise attached as attributes to each vertex. Or?

However, VTF does work satisfactorily on ATI's newer hardware (2600 series and later?) and obviously also on NV since the GeForce 6800 series. And you'd want to stay with NEAREST (point) filtering for the permutations anyway, so the texture-format restrictions on GPUs that are only SM3 just mean you'll use a bit more memory for textures that are already small.

It should be possible to do Perlin-style noise without texture sampling, with a basic algorithm such as: take the seed(s) from the texture coordinate(s), do a few middle-square iterations, and there's your random value. You have to do the bilinear interpolation yourself, however.