Strange behaviour of reflections & refractions in GLSL

Maybe it's just Monday morning, but I'm having a lot of trouble with GLSL. I had two little vertex and fragment programs written in assembly which simulated specular reflections using an environment cube map. Now I wanted to do the same thing with GLSL, so I tried to "translate" my assembly, but surprise surprise, it's acting quite weird…
On curved surfaces the reflection is fairly correct, deforming and shifting in a convincing way where expected. On flat surfaces it goes wrong every time:

it stretches, it zooms in a lot (only a few pixels of the map are displayed on a large "mirror"), and it seems to be projected in perspective, so that the image appears to recede into the distance regardless of the flat mesh it should be mapped onto.

My vertex program is:

// uniform parameter, used to calculate the eye vector
uniform vec4 Camera;

// varying variables passed to fragment shader
varying vec3 PixColor;
varying vec3 TexCoord;
varying vec4 Eye;

void main (void)
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

    vec4 eye_temp = normalize (gl_Position - Camera);
    //Eye = normalize (eye_temp);
    //Eye = vec4 (0,0,1,1);

    //vec3 tnorm = normalize (gl_NormalMatrix * gl_Normal);

    //TexCoord = refract (eye_temp.xyz, gl_Normal, float (0.66));
    TexCoord = reflect (eye_temp.xyz, gl_Normal);
}
  

and my fp is:

 
// texture units incoming
uniform samplerCube map;

// varying variables passed to fragment shader
varying vec3 PixColor;
varying vec3 TexCoord;
varying vec4 Eye;


void main (void)
{
    vec4 texcolor = textureCube (map, TexCoord);

    gl_FragColor = texcolor;

    // just to check output
    //gl_FragColor = vec4 (1,0,0,1);
    //gl_FragColor = vec4 (FinalTexCoord, 1);
}

 

Where is this code wrong? In assembly the very same algorithm worked fine…

If you suspect perspective problems, I'd worry about the w coordinate in the normalization.
Try whether using a vec3 eye_temp works better:

uniform vec3 Camera;
vec3 eye_temp = normalize(gl_Position.xyz - Camera);
TexCoord = reflect(eye_temp, gl_Normal);

Make sure the camera and eye coordinates are in the correct systems.

I changed my vertex program as follows. Since TexCoord is used to index a cube map, in which coordinate space should the reflection vector be expressed?

// uniform parameter, used to calculate the eye vector
uniform vec4 Camera;

// varying variables passed to fragment shader
varying vec3 PixColor;
varying vec3 TexCoord;
//varying vec4 Eye;

void main (void)
{
    // final vertex transform
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

    // vertex position in eye coordinates
    vec4 tempcoord = gl_ModelViewMatrix * gl_Vertex;
    vec3 eye_temp = normalize (tempcoord.xyz - Camera.xyz);

    // normal in eye coordinates
    vec3 tnorm = normalize (gl_NormalMatrix * gl_Normal);

    // reflected vector (it should be in eye coordinates too)
    TexCoord = reflect (eye_temp, tnorm);
}
  

It still doesn't work, and I don't know why… On one side of the mesh it looks almost right (only almost, anyway). If I rotate the camera to the back side, everything breaks, as if the texture coordinates exploded and started pointing in random directions…
What can it be?

I think the problem lies in the camera.
If you use eye coordinates, the camera is always at the origin.
It should look like this:
vec3 eyeToVert = normalize((gl_ModelViewMatrix * gl_Vertex).xyz);
reflectionVector = reflect(eyeToVert, normal);
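
Put together as a complete vertex shader, that suggestion might look like the sketch below (the `TexCoord` varying name follows the earlier posts; since everything here is in eye space, the cube map lookup must use the same space, or the map must be rotated accordingly):

```glsl
// Sketch: with eye-space coordinates the camera sits at the
// origin, so no Camera uniform is needed at all.
varying vec3 TexCoord;

void main (void)
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

    // Vector from the eye (the origin in eye space) to the vertex.
    vec3 eyeToVert = normalize ((gl_ModelViewMatrix * gl_Vertex).xyz);

    // Normal transformed into the same (eye) space.
    vec3 normal = normalize (gl_NormalMatrix * gl_Normal);

    // Reflection vector, also in eye space; used for the
    // cube map lookup in the fragment shader.
    TexCoord = reflect (eyeToVert, normal);
}
```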

Originally posted by Relic:
I think the problem lies in the camera.
If you use eye coordinates, the camera is always at the origin.
It should look like this:
vec3 eyeToVert = normalize((gl_ModelViewMatrix * gl_Vertex).xyz);
reflectionVector = reflect(eyeToVert, normal);

You were almost right…
It was a matter of inconsistent coordinate spaces…

Anyway, in GLSL can I pass data from the VP to the FP in any form other than varyings?
Thanks, the G

No, the only interface between VP and FP is the standard built-in gl_* varyings and user-defined varyings.
This is a scarce resource (16 vec4s or so); if you exceed it, the program won't link.
Varyings can have the usual types.
What do you have in mind?

Well, I had in mind building a Fresnel material shader with colour attenuation. I found some examples on gamedev which use Cg and compute all the Fresnel coefficients in the VP, then pass them to the FP. I was just wondering whether I could do the same thing with GLSL.
I'll study the problem a bit longer and see if I can outflank it :slight_smile:

Thanks anyway, Relic, your help was decisive :slight_smile:

edit:

OK, here is the problem:
I want to add Fresnel-effect support, so I need to pass this value from the VP to the FP:

reflectionFactor = fresnelBias + fresnelScale * pow (1.0 + dot (I, N), fresnelPower);

which I must compute in the VP, since the dot(I, N) part is vertex-dependent.
This value is constant per vertex, though, so I don't want it interpolated as a varying… In assembly I'd pack it into an unused set of texture coordinates and send it to the other side, but in GLSL?!?
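
For what it's worth, a sketch of the per-vertex Fresnel term passed through a varying (the `ReflectionFactor` varying name and the three uniform names are my own, not from the thread; the value will still be interpolated across the triangle, exactly as unused texture coordinates were in assembly):

```glsl
// Vertex shader sketch: compute the Fresnel approximation per
// vertex and hand it to the fragment shader as a varying.
uniform float fresnelBias;    // assumed names, set by the application
uniform float fresnelScale;
uniform float fresnelPower;

varying vec3  TexCoord;
varying float ReflectionFactor;   // hypothetical varying name

void main (void)
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

    // Incident vector and normal, both in eye space.
    vec3 I = normalize ((gl_ModelViewMatrix * gl_Vertex).xyz);
    vec3 N = normalize (gl_NormalMatrix * gl_Normal);

    TexCoord = reflect (I, N);

    // The approximation quoted in the post above.
    ReflectionFactor = fresnelBias + fresnelScale * pow (1.0 + dot (I, N), fresnelPower);
}
```

In the fragment shader, `ReflectionFactor` arrives as an ordinary varying and could be used e.g. in `mix (baseColor, texcolor, ReflectionFactor)`.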

Thanks, the G.

In GLSL the compiler finds an unused interpolator for you; that's what "varying" variables with the same name in the vertex and fragment programs are for. You already used one.
I didn't quite follow why dot(I, N) should be constant; it normally changes per pixel.
If you want to put a constant directly into the fragment program, define a uniform in the source, get the location and fill it from the application.
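
As a sketch, a constant supplied that way is just a uniform in the fragment shader source (the `reflectionFactor` uniform name here is illustrative; the application sets it once via glGetUniformLocation/glUniform1f):

```glsl
// Fragment shader sketch: a constant supplied by the application.
uniform samplerCube map;
uniform float reflectionFactor;   // filled from the app with glUniform1f

varying vec3 TexCoord;

void main (void)
{
    vec4 texcolor = textureCube (map, TexCoord);

    // Attenuate the reflection by the application-supplied constant.
    gl_FragColor = vec4 (texcolor.rgb * reflectionFactor, 1.0);
}
```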
Check out the Refractive Dispersion demo here:
http://download.nvidia.com/developer/SDK/Individual_Samples/samples.html

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.