Transform data received in the shader looks different from the data sent

I am running into an issue where the transform data I send to the shader for instanced draw calls does not match the data the shader actually receives.

I am sending an array of floats.

The number of floats in the array is a multiple of 16 (every 16 floats represents a transform).

Right now I am just sending a bunch of identity transforms, but when I view the array in RenderDoc, the set of 4 vectors that each instance of the draw call uses does not match expectations.

[Screenshot: the data each instance is using, as shown in RenderDoc]

I am expecting the data to be like this:

1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1

But the first 16 floats look like this in RenderDoc (each row in RenderDoc contains 16 floats; only some are visible here):

1 0 0 0
0 0 0 0
0 0 0 1
0 0 1 0

This is the code I use to upload the transform data for the shader. (This is a Qt Creator C++ project; Qt has its own OpenGL function bindings.)

        m_transformbo.allocate(m_listOfTransforms.data(), m_listOfTransforms.size() * sizeof(float));

        if (m_listOfTransforms.size() > 0) {
            for (int i = 0; i < 4; i++) {
                int shaderLocation = m_shader_locs.transforms + i;
                m_view->shaderProgram()->enableAttributeArray(shaderLocation);

                int offset = i * sizeof(float);
                int stride = 16 * sizeof(float);
                int tupleSize = 4;

                m_view->shaderProgram()->setAttributeBuffer(shaderLocation, GL_FLOAT, offset, tupleSize, stride);
                m_view->glVertexAttribDivisor(shaderLocation, 1);
            }
        }

Here’s the function signature for setAttributeBuffer:

I would post the link, but I can't post links yet:

void QOpenGLShaderProgram::setAttributeBuffer(const char *name, GLenum type, int offset, int tupleSize, int stride = 0)

This is an overloaded function.

Sets an array of vertex values on the attribute called name in this shader program, starting at a specific offset in the currently bound vertex buffer. The stride indicates the number of bytes between vertices. A default stride value of zero indicates that the vertices are densely packed in the value array.

The type indicates the type of elements in the vertex value array, usually GL_FLOAT, GL_UNSIGNED_BYTE, etc. The tupleSize indicates the number of components per vertex: 1, 2, 3, or 4.

The array will become active when enableAttributeArray is called on the name. Otherwise the value specified with setAttributeValue for name will be used.

See also setAttributeArray.

Then try posting to a Qt forum. There’s no reason to assume that anyone here would understand that code. You didn’t even include the declarations, so it’s not like people can even look up the types in the Qt documentation.

Understood. I made a similar post on a Qt forum. I also added the only relevant function definition.

I haven’t seen all of your code, but it looks very much like you’ve fallen victim to one of the worst aspects of what Qt laughably calls its “OpenGL API”. Specifically, making enableAttributeArray and the other attribute functions members of the shader program class despite their having absolutely no effect on shader program objects.

Those functions do not set state into a shader program. They set state into an OpenGL vertex array object. Qt technically has a class for VAOs, but that class doesn't have these member functions.

You need to bind a VAO before calling these functions when setting up your rendering. Then you need to bind that VAO later on when it comes time to render that object.
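A minimal sketch of what that looks like with Qt's wrapper class, assuming a `QOpenGLVertexArrayObject` member named `m_vao` (that name is hypothetical; the rest follows the code posted above):

```cpp
#include <QOpenGLVertexArrayObject>

// --- Setup (once), with the GL context current ---
m_vao.create();          // hypothetical QOpenGLVertexArrayObject member
m_vao.bind();            // attribute state set below is recorded into this VAO

m_transformbo.bind();
// ... the enableAttributeArray / setAttributeBuffer / glVertexAttribDivisor
// ... calls from the setup code above go here, while the VAO is bound
m_vao.release();

// --- Render ---
m_view->shaderProgram()->bind();
m_vao.bind();            // restores all of the recorded attribute state
// ... glDrawArraysInstanced / glDrawElementsInstanced call goes here
m_vao.release();
```

The key point is that `enableAttributeArray`, `setAttributeBuffer`, and `glVertexAttribDivisor` all write into whatever VAO is bound at the time, so the same VAO must be bound again at draw time.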

I figured it out: my

int offset = i * sizeof(float);

should have been

int offset = i * sizeof(float) * 4;

This makes sense, since each attribute row's offset should advance by 4 floats (one vec4, i.e. one row of the matrix), not by 1 float.