glGetAttribLocation returning -1

Hello OGL Experts :slight_smile:

I’m currently warming up to OpenGL 3.2 / GLSL 1.50 in C#, using the OpenTK wrapper, which is close enough to the real API to make me think my issue is an OpenGL issue rather than an OpenTK one.

I have created a small vertex shader & fragment shader. My vertex shader has 3 inputs (color, normal and, of course, position) and 2 outputs (color, normal), which are used in my fragment shader.

My issue, in short, is that the following code (run after linking the “ShaderProgram”, to which I attached my vertex and fragment shaders)

int positionLocation = GL.GetAttribLocation(pipeline.ShaderProgram, "in_position");
int colorLocation = GL.GetAttribLocation(pipeline.ShaderProgram, "in_color");
int normalLocation = GL.GetAttribLocation(pipeline.ShaderProgram, "in_normal");

returns:

positionLocation = 0
colorLocation = 1
normalLocation = -1

The shader program compiled and linked successfully, and glGetError returns “no error”…

The vertex shader is as follows:

#version 150

precision highp float;

uniform mat4 projection_matrix;
uniform mat4 modelview_matrix;

in vec3 in_position;
in vec3 in_color;
in vec3 in_normal;

out vec3 normal;
out vec3 color;

void main(void)
{
  // works only for an orthogonal modelview matrix
  normal = (modelview_matrix * vec4(in_normal, 0)).xyz;
  // send color information to the fragment shader
  color = in_color;
  gl_Position = projection_matrix * modelview_matrix * vec4(in_position, 1);
}

So, a quite simple shader, and of course “in_normal” is there…

Also, if I don’t call EnableVertexAttribArray(normalLocation), the code runs fine and the vertices and colors are correct. The normals, of course, are not…
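For reference, my attribute setup looks roughly like this (a sketch; “normalBuffer” is a placeholder name for my VBO handle):

GL.BindBuffer(BufferTarget.ArrayBuffer, normalBuffer);
GL.EnableVertexAttribArray(normalLocation);
GL.VertexAttribPointer(normalLocation, 3, VertexAttribPointerType.Float, false, 0, 0);

which obviously can’t work while normalLocation is -1.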

Is there anything I need to set up to have more than 2 attribs?


This can only happen if the fragment shader doesn’t really use the “vec3 normal”. Compilers can optimize out its use, as in:

  gl_FragColor = vec4(normal, 1);
  gl_FragColor = vec4(1, 2, 3, 4); // now "normal" gets optimized out


Ilian is right: you are using “in_normal” to compute “normal”, but “normal” is then not used to compute any vertex shader output. So the clever compiler simply discarded it at compile time!

Thank you Ilian and Dletozeun!

Are you saying I got outsmarted by a Shader Compiler ?! :slight_smile:

I’m not sure I understand: how am I then supposed to feed data to that supposedly useless in_normal? Of course, I only use it in the vertex shader, not in the fragment shader, but “normal” is used in the fragment shader for (poor) lighting computations, like this:

#version 150

precision highp float;

const vec3 ambient = vec3(0.1, 0.1, 0.1);
const vec3 lightVecNormalized = normalize(vec3(0.5, 0.5, 2));
const vec3 lightColor = vec3(0.9, 0.9, 0.7);

in vec3 normal;
in vec3 color;

out vec4 frag_color; // the built-in gl_FragColor can't be redeclared in GLSL 1.50

void main(void)
{
  float diffuse = clamp(dot(lightVecNormalized, normalize(normal)), 0.0, 1.0);
  frag_color = normalize(vec4(color + ambient + diffuse * lightColor, 1.0));
}

So, in the fragment shader, I use “normal”… and since “normal” is an input, it must come from the vertex shader, where it’s computed from “in_normal”. In that regard, WHO is that compiler to flag “in_normal” as useless? :slight_smile:

More importantly, what’s the key difference between “in_color” and “in_normal” ? “in_color” is even less used in the Vertex Shader, but it gets a proper location… :frowning:

The code looks alright, and “normal” really is used :S. Maybe we’re missing something.
I generally use glBindAttribLocation instead of letting the linker decide.


Out of sheer desperation, I added this before Linking my shader:

GL.BindAttribLocation(shaderProgram, 0, "in_position");
GL.BindAttribLocation(shaderProgram, 1, "in_color");
GL.BindAttribLocation(shaderProgram, 2, "in_normal");

And now normalLocation gets the proper value, which is 2.

I feel that somehow this shouldn’t be necessary; a linker should be able to number inputs from 0 to 2 :slight_smile: Is this some kind of bug in the linker? I thought I read somewhere in the spec that if no binding has been done, one will be provided anyway.

Thanks for the hint!
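(For reference: if the driver exposes GL_ARB_explicit_attrib_location, which was promoted to core in GLSL 3.30, the locations can also be pinned in the GLSL source itself instead of via BindAttribLocation; a sketch, assuming that extension is available:

#version 150
#extension GL_ARB_explicit_attrib_location : require

layout(location = 0) in vec3 in_position;
layout(location = 1) in vec3 in_color;
layout(location = 2) in vec3 in_normal;
)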


Oh, my mistake! Sorry. :frowning: I missed the “in vec3 normal”… so “normal” is actually used.

Binding an attribute name to a generic attribute index does not mean that your shaders are OK. Actually, you can bind any attribute name to a generic attribute index, including names that are not used in any vertex shader. Thus, the fact that OpenGL did not bind the “in_normal” attribute is either a driver bug, or something is wrong in the shader and OpenGL has thrown a compile or link error. Have you checked that?
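A minimal sketch of such a check with OpenTK (assuming the GL.GetProgram / GL.GetProgramInfoLog bindings; exact enum names can differ between OpenTK versions):

int linkStatus;
GL.GetProgram(pipeline.ShaderProgram, GetProgramParameterName.LinkStatus, out linkStatus);
if (linkStatus == 0)
    Console.WriteLine(GL.GetProgramInfoLog(pipeline.ShaderProgram));

The info log usually names the offending line when a compile or link error occurred; the same pattern with GL.GetShader and ShaderParameter.CompileStatus covers the individual shaders.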