Shader debugging: problems passing vertex normals

Hello, OpenGL forums.

I’ve been trying to get acquainted with some of the newer OpenGL features (3.2 core) such as VBOs, vertex attributes and GLSL - and have been fairly successful at doing so. I have, however, run into a problem regarding the passing of vertex normal data to the shader program. The program I’m working on loads a simple polygon mesh from a file and renders it to the screen (no texturing is involved as of yet).

Please take a look at the following code and see whether my OpenGL call flow has any fundamental mistakes or logic errors:

I’m using a very typical VN (vertex + normal) interleaved vertex structure padded to a 32-byte stride (the actual size of the struct is exactly 32 bytes as well).
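
For reference, the struct is roughly the following (a sketch: the vx/vy/vz names match the &vertarray[0].vx access further down, the rest is illustrative):

/* interleaved vertex/normal layout, padded to a 32-byte stride */
typedef struct vertex {
    float vx, vy, vz;   /* position */
    float nx, ny, nz;   /* normal   */
    float pad[2];       /* explicit padding: 6*4 = 24 bytes -> 32 bytes */
} vertex;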


unsigned short *indices = NULL;
indices = loadModel("pharaoh.bobj", &VBOid);

In loadModel(), a new vertex buffer object is created like so:


glGenBuffers(1, VBOid);
glBindBuffer(GL_ARRAY_BUFFER, *VBOid);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertex)*vertexcount, &vertarray[0].vx, GL_STATIC_DRAW);

Also, an “index buffer object” (IBO) is created:


glGenBuffers(1, &IBOid);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBOid);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned short int)*facecount*3, indices, GL_STATIC_DRAW);

// the indices variable in the line above is a temporary variable of type 
// unsigned short int *, and is returned by the function.
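
For context, the loadModel() call at the top implies a declaration roughly along these lines (a sketch, not the exact signature; the IBO id presumably lives outside the function):

/* loads the mesh, uploads the VBO/IBO as shown above, writes the new VBO id
   through the pointer, and returns the temporary index array (facecount*3
   unsigned shorts) */
unsigned short *loadModel(const char *filename, GLuint *VBOid);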
   

Shader-related stuff:

 
// Load and compile shaders, create program, attach shaders, check for errors
glBindAttribLocation(programHandle, 0, "position");
glBindAttribLocation(programHandle, 1, "normal");

glLinkProgram(programHandle);

Querying the vertex attribute locations with glGetAttribLocation() after linking gives the expected results, provided the input variables are actually used in the shaders (and not optimized out by the GLSL compiler).
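
For completeness, this is the kind of check I mean (a sketch; the variable names are just for illustration):

GLint posLoc = glGetAttribLocation(programHandle, "position");
GLint nrmLoc = glGetAttribLocation(programHandle, "normal");
/* after a successful link these come back as 0 and 1, matching the
   glBindAttribLocation calls above; -1 would mean the input was
   optimized out or the name is wrong */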

OK. When it comes to actually rendering the mesh, I’ve used the following code:

 
#define BUFFER_OFFSET(i) ((char *)NULL + (i))
...
glBindBuffer(GL_ARRAY_BUFFER, VBOid);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(vertex), BUFFER_OFFSET(0));
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(vertex), BUFFER_OFFSET(3*sizeof(float)));

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBOid);
glDrawElements(GL_TRIANGLES, facecount * 3, GL_UNSIGNED_SHORT, BUFFER_OFFSET(0)); // three indices per triangle face

glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);

This actually results in correct geometry, but even with a simple fragment shader (such as the following one) I can’t seem to confirm that the normals are actually being passed through correctly:

 
#version 150
 
precision highp float;

in vec3 normal;

const vec3 ambient = vec3( 0.1, 0.1, 0.1 );
const vec3 lightVecNormalized = normalize(vec3( 0.5, 0.5, 2.0 ));
vec3 lightColor = vec3( 0.1, 0.7, 0.6 );

out vec4 out_frag_color;
 
void main(void)
{
	float diffuse = clamp( dot( lightVecNormalized, normalize( normal ) ), 0.0, 1.0);
	out_frag_color = vec4( ambient + diffuse * lightColor, 1.0 );
}

Result: flat, uniform, grey shading (see photo below), which can’t be right, since the fragment shader’s output color should depend directly on the vertex normals.

Now, what could possibly be the cause of this?

I take it your statement

in vec3 normal;

is for the normal coming out of the vertex shader; but this is also the name used for the attribute going into the vertex shader. They can’t be the same.

Brilliant! It worked - thank you very much.

I simply changed the identifiers of the vertex attributes so that they wouldn’t clash with the names of the variables the vertex shader passes on to the fragment shader.

Also, I had to change my vertex shader to output the normal vectors, which the fragment shader then takes in as an input variable.
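
For anyone hitting the same thing, the working vertex shader now looks roughly like this (a sketch: in_position/in_normal and the MVP uniform name are just examples, and the glBindAttribLocation calls have to use the new attribute names as well):

#version 150

uniform mat4 modelViewProjection; // example name for whatever MVP uniform is used

in vec3 in_position;  // renamed attribute (was "position")
in vec3 in_normal;    // renamed attribute (was "normal")

out vec3 normal;      // feeds the fragment shader's "in vec3 normal"

void main(void)
{
	normal = in_normal; // transform by the normal matrix here if lighting in eye space
	gl_Position = modelViewProjection * vec4(in_position, 1.0);
}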

Here’s a picture of the correct shading:

Now, if only I could get the geometry shader working… :)

What problem do you have with the geometry shader?

Well, I was trying to adapt some of the ideas presented on this page (especially the Normal Plots section, which I found particularly useful given the original problem), and I’ve since managed to figure it out on my own - thanks for asking though!

The solution was simply to render the mesh as GL_POINTS instead of GL_TRIANGLES (which the article states clearly as well :P).
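
In case it helps someone else, the normal-plot idea boils down to a geometry shader along these lines (a sketch rather than my exact code: vNormal and normalLength are made-up names, and it assumes the vertex shader passes the object-space position through gl_Position and the object-space normal as vNormal):

#version 150

layout(points) in;
layout(line_strip, max_vertices = 2) out;

uniform mat4 modelViewProjection;

in vec3 vNormal[];           // normal passed through from the vertex shader

const float normalLength = 0.1;

void main(void)
{
	vec4 p = gl_in[0].gl_Position;

	// emit one short line segment per input point, from the vertex along its normal
	gl_Position = modelViewProjection * p;
	EmitVertex();

	gl_Position = modelViewProjection * (p + vec4(normalize(vNormal[0]) * normalLength, 0.0));
	EmitVertex();

	EndPrimitive();
}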
