I am trying to learn how to write a displacement shader, and I began with the ozone3d tutorial. However, the following vertex shader does no displacement, even though it is almost verbatim the shader from the ozone3d example. I am trying this on an ATI 3650 card under Windows Vista.
uniform sampler2D EarthNight;

void main()
{
    gl_TexCoord[0].xy = gl_MultiTexCoord0.xy;
    vec4 dv = texture2D(EarthNight, gl_MultiTexCoord0.xy);
    float df = 0.30 * dv.x + 0.59 * dv.y + 0.11 * dv.z;
    vec4 newVertexPos = vec4(gl_Normal * df * 100.0, 0.0) + gl_Vertex;
    gl_Position = gl_ModelViewProjectionMatrix * newVertexPos;
}
Any pointers on anything obvious I might be missing?
Could be any number of things wrong, but nothing really grabs me in the snippet you posted.
Getting any errors? Check your shader/program logs? Right normal supplied? Need to transform it? Texture bound? Texcoords right? …
Sometimes a nice debugger can be helpful in tracking down the stuff that slips down into the cracks. Check out e.g. glIntercept or glslDevil. They’ll report any errors and let you view your textures and such.
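The log check suggested above can be sketched in C as follows. This assumes a current GL 2.0+ context; on Windows these entry points have to be loaded first (e.g. via GLEW or wglGetProcAddress), and the handles `vs` and `prog` are placeholders for whatever your own loader created with glCreateShader/glCreateProgram:

```c
#include <stdio.h>

/* Sketch: dump compile and link logs. Requires a current OpenGL 2.0+
   context; 'vs' and 'prog' are hypothetical handles from your loader. */
static void check_gl_logs(GLuint vs, GLuint prog)
{
    GLint ok = GL_FALSE;
    char log[4096];

    glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        glGetShaderInfoLog(vs, sizeof(log), NULL, log);
        fprintf(stderr, "vertex shader compile failed:\n%s\n", log);
    }

    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    if (ok != GL_TRUE) {
        glGetProgramInfoLog(prog, sizeof(log), NULL, log);
        fprintf(stderr, "program link failed:\n%s\n", log);
    }
}
```

Even without a debugger, this catches the common case where the driver silently falls back to software or refuses the shader outright.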
Make sure your texture params are set up like this:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
And make sure you use one of the following two internal formats:
oh, and smile! You’re on ATI hardware!
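Put together, the setup above looks roughly like this in C. The two internal formats did not survive in the post; GL_RGBA32F_ARB below is my assumption, based on the float formats (ARB_texture_float) that vertex texture fetch typically required on hardware of that era, so treat it as a placeholder rather than a quote from the original:

```c
/* Sketch of the suggested texture setup. Requires a current GL context
   with ARB_texture_float; 'tex', 'width', 'height', 'pixels' are
   placeholders. GL_RGBA32F_ARB is an assumption; the two internal
   formats named in the original post were not preserved. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB,
             width, height, 0, GL_RGBA, GL_FLOAT, pixels);
```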
I forgot to mention that I was trying this shader in RenderMonkey, as I thought that would be the quickest way to learn and prototype shaders. I will write a sample program that uses the same shaders so that I can use glslDevil.
I created a sample program with the shader. Unfortunately, it seems glslDevil will not let you debug, set breakpoints, or create watches on variables on ATI cards. Do I really have to buy an Nvidia card to do OpenGL development?
You pretty much need to be able to program to do OpenGL development. If you’re using glslDevil or RenderMonkey, then you’re not really an OpenGL developer.
come back once you’ve bought a compiler.
In a GNU world, the compiler buys you.
I will ignore the part about what constitutes “true opengl development”.
I did say that I wrote a sample program which loads the shaders. The shader still doesn’t behave the way I want it to on Windows or Linux using my sample program. And when a program doesn’t do what you want it to do, you reach for debuggers. A Google search for a GLSL debugger brings up glslDevil as the most promising candidate. Honestly, I would be interested to know how you would do things differently if you were starting out.
Strangely enough, the shader semi-works on a Mac: no displacement, but at least df gets non-zero values. Shader Builder on the Mac has some other issues, though, for which I filed a bug report today.
In that case, did you follow my instructions on setting up the texture parameters and format? You didn’t say.
glslDevil is rubbish, by the way. It just doesn’t work. Lovely GUI, though; they’ve obviously put a lot of effort into the GUI.
"glslDevil is rubbish, by the way. It just doesn’t work. Lovely GUI, though; they’ve obviously put a lot of effort into the GUI."
Works fine for me here on Nvidia. It’s probably the most useful tool I’ve got. And I am an OpenGL developer, btw!