Integer vertex attributes don't work

hi,

I'm trying to use an ivec2 vertex attribute with an orthographic projection.

Here is the vertex shader:


// Vertex shader

#version 330

in ivec2 in_Vertex;
in vec2 in_TexCoord0;

uniform mat4 projection;

out vec2 texCoord0;

void main()
{
    gl_Position = projection * vec4( in_Vertex, 0, 1 );
    texCoord0   = in_TexCoord0;
}

Nothing is drawn on the screen, but when I use float vertices everything works fine.
I call glVertexAttribIPointer with GL_INT to pass the data to the shader.

Edit: fixed. I am on an x64 system where int is 8 bytes but the shader expects 4 bytes, so I must use long or int32 instead.

What 64-bit system has 8-byte ints, but 4-byte longs? I thought the C specification required long to be at least as large as int?

And that’s why you should use the GL types: GLint, GLuint, etc.

I am not coding in C but in another language that has no typedefs.

That language should still have some means of accessing OpenGL’s types. Otherwise you can’t properly implement OpenGL in that language.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.