diffuse bumpmapping with ARB_vertex_program

Good day folks.
As a new visitor and recent user of ARB_vertex_program, I’m facing a problem while transferring the calculation of a diffuse bumpmapping effect from the CPU to the GPU.

In the previous version of my work (without any vertex program) the effect is computed on the CPU using the classical combination of a normalization cube map, a normal map and a base map. The light is transformed into object space; then the L vector from vertex to light is projected into the local tangent space, normalized by the cube map lookup, and finally dot3-combined with the normal map.
Nothing unusual there.
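Roughly, the per-vertex CPU work looks like this (a simplified sketch only; the vec3 type and helpers are shorthand for illustration, not my actual code):

typedef struct { float x, y, z; } vec3;

static float vec3_dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3  vec3_sub(vec3 a, vec3 b) { vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }

/* lightObj: light position already transformed into object space.
 * S, T, N: the per-vertex tangent basis.
 * The (s, t, n) triple goes into the cube map texcoord array; the cube
 * map lookup normalizes L, and the texture stage does the dot3 with the
 * normal map. */
void compute_cubemap_coord(vec3 lightObj, vec3 vertex,
                           vec3 S, vec3 T, vec3 N, float out[3])
{
    vec3 L = vec3_sub(lightObj, vertex);  /* unnormalized L vector      */
    out[0] = vec3_dot(S, L);              /* project L onto the tangent */
    out[1] = vec3_dot(T, L);              /* ... the bitangent          */
    out[2] = vec3_dot(N, L);              /* ... the normal             */
}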

It works just fine. But the vertex program I’m now writing for the same purpose fails to reproduce the effect. The object renders, but the bumpmapped lighting reacts when only the camera position changes! You may say I have a matrix problem, but I can’t find what in the world differs between the way I compute light position and normals the old way (on the CPU) and the new way (on the GPU).
If I may, I’ll write down the sequence of my GL calls preceding glDrawArrays(), followed by the guilty vertex program.

Here is the process (a C sketch follows the list):

  • centering the modelview matrix to the object
  • transforming light position into object space (using inverse modelview matrix)
  • passing this relative position to the GL (glLightfv(GL_LIGHT0, GL_POSITION, relativeSrcPos))
  • pointing to the vertex array
  • pointing to the normal array
  • pointing to the texCoord for TMU0 (cube map)
  • pointing to the texCoord for TMU1 (normal map)
  • pointing to the texCoord for TMU2 (base map)
  • pointing to S and T arrays as generic parameters 11 and 12 of the vertex program
  • glDrawArrays
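In C it looks roughly like this (client-state enables omitted for brevity; transform_point() and the array names are placeholders for my own code):

glPushMatrix();
glMultMatrixf(objectMatrix);                        /* center the modelview on the object */

/* transform the light into object space with the inverse modelview */
float relativeSrcPos[4];
transform_point(relativeSrcPos, inverseModelview, worldLightPos);
relativeSrcPos[3] = 1.0f;                           /* positional light */

glLightfv(GL_LIGHT0, GL_POSITION, relativeSrcPos);

glVertexPointer(3, GL_FLOAT, 0, vertices);
glNormalPointer(GL_FLOAT, 0, normals);

glClientActiveTextureARB(GL_TEXTURE0_ARB);          /* TMU0: cube map   */
glTexCoordPointer(3, GL_FLOAT, 0, cubeMapCoords);
glClientActiveTextureARB(GL_TEXTURE1_ARB);          /* TMU1: normal map */
glTexCoordPointer(2, GL_FLOAT, 0, normalMapCoords);
glClientActiveTextureARB(GL_TEXTURE2_ARB);          /* TMU2: base map   */
glTexCoordPointer(2, GL_FLOAT, 0, baseMapCoords);

glVertexAttribPointerARB(11, 3, GL_FLOAT, GL_FALSE, 0, sVectors); /* S basis vectors */
glVertexAttribPointerARB(12, 3, GL_FLOAT, GL_FALSE, 0, tVectors); /* T basis vectors */

glDrawArrays(GL_TRIANGLES, 0, vertexCount);
glPopMatrix();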

And here is my guilty vertex program:

"!!ARBvp1.0
OPTION ARB_position_invariant;
ATTRIB iPos = vertex.position;
ATTRIB iColor = vertex.color;
ATTRIB iNormal = vertex.normal;
ATTRIB iTexCoord0 = vertex.texcoord[0];\ #cube map
ATTRIB iTexCoord1 = vertex.texcoord[1];\ #normal map
ATTRIB iTexCoord2 = vertex.texcoord[2];\ #base map
ATTRIB coordS = vertex.attrib[11];\ #generic parameter 11 (S-array element)
ATTRIB coordT = vertex.attrib[12];\ #generic parameter 12 (T-array element)
PARAM lightPos = state.light[0].position;
TEMP vertexToLight;\ #L vector
OUTPUT oColor = result.color;
OUTPUT oTexCoord0 = result.texcoord[0];
OUTPUT oTexCoord1 = result.texcoord[1];
OUTPUT oTexCoord2 = result.texcoord[2];
\

compute the light L vector


SUB vertexToLight, lightPos, iPos;
DP3 vertexToLight.w, vertexToLight, vertexToLight;
RSQ vertexToLight.w, vertexToLight.w;
MUL vertexToLight.xyz, vertexToLight.w, vertexToLight;
\

compute the cube map texture coords

\

projecting L onto S, T and N


DP3 oTexCoord0.x, coordS, vertexToLight;
DP3 oTexCoord0.y, coordT, vertexToLight;
DP3 oTexCoord0.z, iNormal, vertexToLight;
\

nothing else


MOV oColor, iColor;
MOV oTexCoord1, iTexCoord1;
MOV oTexCoord2, iTexCoord2;
END";

(As you can see, I skip the vertex position calculation, hence position_invariant.)
To my mind, when the GPU enters this program, the normal as well as the light position should both already be in object space, therefore no transformation should be needed.
But I must have a matrix problem or something: the bumpmapped effect reacts to the slightest camera rotation. Am I missing some rule in how vertex attributes are passed along?
I read and reread tutorials and examples but can’t find what is messing up my once-beautiful effect.

Thank you for your attention and for any ideas.

john_john

The only thing I can think of is something wrong with the inverse matrix. Try setting the light position with glLight while the modelview matrix is the identity, and see what happens.

Argl! I could die! You’re terribly right. My mistake was setting the light AFTER altering the modelview matrix. I just forgot that the GL transforms the light position by the current modelview matrix, just as it does a point, which explains the reaction to a camera move.

I almost went crazy, because I used the relative light position for a bunch of other calculations (diffuse and specular on the CPU, projected shadows …) and they all rendered fine. That’s no surprise, since there I used the light position my own way. But when passing this position to my vertex program I didn’t pay attention to the transformation the GL applies at the same time, and that’s why everything went wrong.

glLight must be called while the modelview matrix is the identity, i.e. right after a glLoadIdentity.
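In other words (a minimal sketch of the corrected ordering):

glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();                                   /* modelview = identity ...          */
glLightfv(GL_LIGHT0, GL_POSITION, relativeSrcPos);  /* ... so GL stores the object-space
                                                       position untransformed            */
glPopMatrix();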

Thanks a lot, guy!

john_john