BindAttribLocation works, GetAttribLocation doesn't

Hi OpenGL community!

I’m working with a GLSL shader and decided to query the attribute locations after linking instead of binding them beforehand.

So I changed my code from this:

glAttachShader(_shaderProgram, vertexShader);

// bind attribute locations before linking, for later use with VBOs
glBindAttribLocation(_shaderProgram, _pointIndexAttr, "in_Position");
glBindAttribLocation(_shaderProgram, _sizeIndexAttr, "in_Size");
glBindAttribLocation(_shaderProgram, _colorIndexAttr, "in_Color");

glLinkProgram(_shaderProgram);

To this:

glAttachShader(_shaderProgram, vertexShader);

glLinkProgram(_shaderProgram);

_pointIndexAttr = glGetAttribLocation(_shaderProgram, "in_Position");
_sizeIndexAttr = glGetAttribLocation(_shaderProgram, "in_Size");
_colorIndexAttr = glGetAttribLocation(_shaderProgram, "in_Color");
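As a side note, a minimal sanity check for this version (a sketch reusing the names above; the error handling is illustrative, not from the original code) is to verify that the link succeeded and that no location came back as -1, since glGetAttribLocation returns -1 for any name that is not an active attribute in the linked program:

```cpp
// Sketch: verify the link before querying, and check the results.
glLinkProgram(_shaderProgram);

GLint linked = GL_FALSE;
glGetProgramiv(_shaderProgram, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    char log[1024];
    glGetProgramInfoLog(_shaderProgram, sizeof(log), NULL, log);
    fprintf(stderr, "link failed: %s\n", log);
}

_pointIndexAttr = glGetAttribLocation(_shaderProgram, "in_Position");
_sizeIndexAttr  = glGetAttribLocation(_shaderProgram, "in_Size");
_colorIndexAttr = glGetAttribLocation(_shaderProgram, "in_Color");

// -1 means "not an active attribute": misspelled, or declared but
// unused and optimized out by the GLSL compiler.
if (_pointIndexAttr < 0 || _sizeIndexAttr < 0 || _colorIndexAttr < 0) {
    fprintf(stderr, "some attribute was not found after linking\n");
}
```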

And this doesn’t work! o_O

I mean, the first version shows something and the second shows nothing…

I really don’t understand how such a simple change, reading the values after the link instead of setting them before, can cause such a problem. I’m sure I’ve done something stupid.

I never use hard-coded numbers for shader attributes (I always use “_pointIndexAttr” and friends).

You can’t imagine how much this is breaking my head…

Thanks in advance for help!

Regards,

Dorian

You’re going to have to post more than that. For example, if you have multiple programs, each program will have its own set of attribute locations.

Thanks for responding, and sorry,

The whole code is here:

http://code.google.com/p/coral-repo/source/browse/coral/coralUi/src/DrawPointNode.cpp

The binding is made at line 188 in initShader().

I just made the small modification I described in my first post, and it didn’t work…

It seems crazy because, either way, the _*IndexAttr variables are supposed to hold the index of their own attribute…

Thanks in advance!

Just make your program active before querying locations. Linking does not activate the program.

Thanks a lot for the answer!

So I need to set:

glUseProgram(_shaderProgram);
// querying locations
glUseProgram(0);

…in initShader(). Ok!

But will my locations still be correct later, in draw(), when I call glUseProgram(_shaderProgram) again?

Am I supposed to query the locations every frame? o_O

Thanks in advance! :slight_smile:

EDIT: It seems you don’t need to activate the program to query attributes: http://stackoverflow.com/questions/2211910/problem-mapping-textures-to-vbo-in-opengl

No! Unless you relink your program.

Ok, thanks! :slight_smile:

What I don’t understand is that nowhere is it written that you need to call glUseProgram before glGetAttribLocation. It only says that this has to be done after glLinkProgram:

http://www.opengl.org/registry/doc/glspec42.core.20110808.pdf

Can someone tell me where I was supposed to learn that? :frowning:

You are looking at the wrong document. :slight_smile:
GetAttribLocation is described in the GL spec.

Anyway, it seems that linking also binds the program, so it is not really necessary to call UseProgram, at least on NV drivers (I’ve just tried it successfully). But if you are reading the attribute locations right after linking, why don’t you set them with BindAttribLocation instead?

Thanks!

So I need to find the right spec file to look in.

I have an ATI card; maybe that’s why there are some differences about this, even on the Internet.

Yes, I could set them with BindAttribLocation, but I don’t want to define the indices myself…

I use a mat4 attribute (you know, the kind that also occupies location +1, +2, +3 once you bind it), and as I’m not very familiar with it, it’s easy to make a mistake when binding manually:

matrixAttr = 0
anAttr = 1
anotherAttr = 2

This is invalid: since matrixAttr is a mat4 attribute, it occupies locations 0, 1, 2, and 3, so anAttr and anotherAttr at 1 and 2 conflict with it…
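The overlap can be spelled out without any GL calls (a sketch using the names from this thread; none of these constants are from the original code): a mat4 attribute consumes one generic attribute location per column, so the first free index after a matrix bound at 0 is 4.

```cpp
// A mat4 vertex attribute occupies one generic attribute location per
// column, i.e. 4 consecutive locations.
const unsigned kMat4Columns = 4;

// Conflict-free manual layout for the example above:
const unsigned matrixAttr  = 0;                         // occupies 0, 1, 2, 3
const unsigned anAttr      = matrixAttr + kMat4Columns; // 4: first free slot
const unsigned anotherAttr = anAttr + 1;                // 5
```

Binding anAttr to 1 or anotherAttr to 2, as in the listing above, would land inside the matrix’s range [0, 3].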

I don’t want to waste time on this; I’m trying to keep things as simple as possible, and I thought GetAttribLocation was the simplest way, since it lets OpenGL choose which index to assign to each attribute.

Anyway, if you have any good reading about how to deal with manually bound attributes, I would be very grateful if you sent it to me. :slight_smile:

I just tested this:

glAttachShader(_shaderProgram, vertexShader);

glLinkProgram(_shaderProgram);

glUseProgram(_shaderProgram);

_pointIndexAttr = glGetAttribLocation(_shaderProgram, "in_Position");
_sizeIndexAttr = glGetAttribLocation(_shaderProgram, "in_Size");
_colorIndexAttr = glGetAttribLocation(_shaderProgram, "in_Color");
glUseProgram(0);

And it doesn’t work. And:

glUseProgram(_shaderProgram);

_pointIndexAttr = glGetAttribLocation(_shaderProgram, "in_Position");
_sizeIndexAttr = glGetAttribLocation(_shaderProgram, "in_Size");
_colorIndexAttr = glGetAttribLocation(_shaderProgram, "in_Color");

in the draw function doesn’t work either… But simply binding the attributes works…

I found my problem… It was obvious… -_-

In my shader I use an attribute and uniforms:

  • uniform bool useSingleColor
  • uniform vec4 singleColor
  • attribute vec4 ColorPerVertex

The problem is: in my C++ code, if I pass a true “useSingleColor” to my shader, I don’t bind the color buffer (since I’m not supposed to use it).

But that is not allowed!

When you use attributes in your shader, those attribute arrays NEED to be fed with the same number of elements, even if the shader doesn’t use them.

(I should write it on my wall: “If you use attributes, the buffers need to have the same number of elements!”)

This doesn’t make my program crash, but it probably should.

But this is a waste of bandwidth… If you only use one color, you have to send a whole buffer filled with that single color…

Maybe there is a way to avoid this?

Anyone?

Thanks in advance!
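For completeness, one common way to avoid sending a whole buffer of identical colors (a sketch using the names from this thread; r, g, b, a and colorVbo are hypothetical names): leave the color attribute array disabled and set a constant with glVertexAttrib4f, since every vertex then reads the current generic attribute value instead of the array.

```cpp
if (useSingleColor) {
    // No color VBO needed: with the array disabled, every vertex reads
    // the current generic attribute value set below.
    glDisableVertexAttribArray(_colorIndexAttr);
    glVertexAttrib4f(_colorIndexAttr, r, g, b, a); // r,g,b,a: hypothetical
} else {
    glBindBuffer(GL_ARRAY_BUFFER, colorVbo);       // colorVbo: hypothetical
    glVertexAttribPointer(_colorIndexAttr, 4, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(_colorIndexAttr);
}
```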