Beginner problem with shaders


I have a problem understanding this minimal shader:

#version 400
in vec3 vp;
void main() {
    gl_Position = vec4(vp, 1.0);
}

How does OpenGL know to put the vertex into the vp variable?
I could write something like:

#version 400
in vec3 vp;
in vec3 vp1;
in vec3 vp2;
void main() {
    gl_Position = vec4(vp2, 1.0);
}

so for me it is not clear how OpenGL passes a vertex to the shader…

Best Regards,

I am nowhere near a GLSL expert, but my vertex shader is shown below. The “layout (location = …) in” statements define a vertex. Remember that the vertex shader is nothing but a vertex processor; its primary job is to process vertices.

Also realize that a vertex is a piece of data whose definition is largely up to you as the programmer. Vertices always have positions (at least I’ve never seen a case where they do not), but pretty much everything else about them is up to how you want to define them. I defined mine as having positions, UV coordinates, normals, and colors. What you put in them and in what order is basically up to you. You just have to get the shader and the OGL code to agree.

#version 450 core
layout (location = 0) in vec3 Pos;
layout (location = 1) in vec2 UV;
layout (location = 2) in vec3 Normal;
layout (location = 3) in vec4 Color;

uniform mat4 WorldMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;

smooth out vec2 TextureCoordinates;
smooth out vec3 VertexNormal;
smooth out vec4 RGBAColor;
smooth out vec4 PositionRelativeToCamera;
out vec3 WorldSpacePosition;

void main()
{
	gl_Position = WorldMatrix * vec4(Pos, 1.0f);				//Apply object's world matrix.
	WorldSpacePosition = gl_Position.xyz;						//Save the position of the vertex in the 3D world just calculated. Convert to vec3 because it will be used with other vec3's.
	gl_Position = ViewMatrix * gl_Position;						//Apply the view matrix for the camera.
	PositionRelativeToCamera = gl_Position;
	gl_Position = ProjectionMatrix * gl_Position;				//Apply the Projection Matrix to project it on to a 2D plane.
	TextureCoordinates = UV;									//Pass through the texture coordinates to the fragment shader.
	VertexNormal = mat3(WorldMatrix) * Normal;					//Rotate the normal according to how the model is oriented in the 3D world.
	RGBAColor = Color;											//Pass through the color to the fragment shader.
}

As to how OGL passes the vertices to the shader, it is by defining a vertex buffer and filling it. You can optionally have an index buffer that tells it what order to send the vertices in. Here’s my OGL code to define the buffers and the vertices:

bool HandCodedObjectClass::DefineMesh(int NumberOfVertices, GLfloat* VertexList, int NumberOfIndices, GLuint* IndexList, Texture2DClass* ColorMap)
{
	Texture = ColorMap;
	VerticesInMesh = NumberOfVertices;
	IndicesInMesh = NumberOfIndices;

	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glBufferData(GL_ARRAY_BUFFER, NumberOfVertices * sizeof(GLfloat) * 12, VertexList, GL_STATIC_DRAW);

	glGenVertexArrays(1, &vao);
	glBindVertexArray(vao);		//Bind the VAO so it records the attribute setup below.

	glEnableVertexAttribArray(0);
	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 12, nullptr);	//Vertex Position.

	glEnableVertexAttribArray(1);
	glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 12, (GLvoid*)(sizeof(GLfloat) * 3));	//Vertex UV.

	glEnableVertexAttribArray(2);
	glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 12, (GLvoid*)(sizeof(GLfloat) * 5));	//Vertex Normal.

	glEnableVertexAttribArray(3);
	glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 12, (GLvoid*)(sizeof(GLfloat) * 8));	//Vertex RGBA color.

	glGenBuffers(1, &ibo);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLuint)*NumberOfIndices, IndexList, GL_STATIC_DRAW);

	return true;
}

Once you have your buffers loaded with the vertices, you set your uniforms to pass in the non-vertex data and then draw.

	glBindVertexArray(vao);
	glDrawElements(GL_TRIANGLES, IndicesInMesh, GL_UNSIGNED_INT, nullptr);
	glBindVertexArray(0);			//Release it.

[QUOTE=TinTin82;1286535]How does OpenGL know to put the vertex into the vp variable?
I could write something like:

#version 400
in vec3 vp;
in vec3 vp1;
in vec3 vp2;
void main() {
    gl_Position = vec4(vp2, 1.0);
}

so for me it is not clear how OpenGL passes a vertex to the shader…[/QUOTE]

“vp”, “vp1”, and “vp2” are each a “generic vertex attribute”.
When the vertex shader is attached to a program object, and that program object gets linked by OpenGL, each generic vertex attribute gets an internal location (consider it a kind of array index).

Later, when you call glDrawArrays(…) or another draw command, the currently bound vertex array object (VAO) sends data from a buffer object to the currently used program. Even if you didn’t create a VAO yourself, OpenGL has a default VAO, which is 0.

How does the VAO know which buffer object it has to pull the data from?

To make the current VAO (let’s assume you don’t have any, so it’s 0) pull data for “vp1” from “myvertexbuffer”, you call:

glBindBuffer(GL_ARRAY_BUFFER, myvertexbuffer);
glVertexAttribPointer(location, ...);

To assign that location to the program, you’d call:

glBindAttribLocation(myprogram, location, "vp1");

“location” is an integer from 0 up to GL_MAX_VERTEX_ATTRIBS - 1.

To skip that call, you can set the location explicitly by writing the vertex shader like BBeck1 showed above:

layout (location = 0) in vec3 vp1;

Thank you very much for the replies. They helped me a lot in getting the vertex shader working.
But now there is a second problem, with the perspective transformation: if I apply a frustum matrix I cannot see anything but the background color.

My drawing function is:

    Qvec3_KameraPosition = QVector3D(0.0f,1.0f,0.0f);
    Qvec3_KameraSpot = QVector3D(0.0f,0.0f,0.0f);
    Qvec3_KameraUp = QVector3D(0.0f,1.0f,0.0f);

    // Combined matrix for the camera position
    Qmat4_ViewTransformation = QMatrix4x4();

    Qmat4_ProjectionTransformation = QMatrix4x4();
    Qmat4_ProjectionTransformation.frustum(-0.5f,0.5f,-0.5f,0.5f,0.1f,10.0f); // If I apply this, I can see no picture
    // Here some code is missing. glGetUniformLocation is done already
    pGLFunc_4_3Core->glUniformMatrix4fv(gli_ProjectionTransformation_NameId,1,GL_FALSE,(GLfloat *)Qmat4_ProjectionTransformation.transposed().data());
    pGLFunc_4_3Core->glUniformMatrix4fv(gli_ViewTransformation_NameId,1,GL_FALSE,(GLfloat *)Qmat4_ViewTransformation.transposed().data());

	// X - Axis red
	//glColor3f(1.0f, 0.0f, 0.0f);
	glVertex3f(0.0f, 0.0f, 0.0f);
	glVertex3f(1.0f, 0.0f, 0.0f);
	// Y - Axis green
	//glColor3f(0.0f, 1.0f, 0.0f);
	glVertex3f(0.0f, 0.0f, 0.0f);
	glVertex3f(0.0f, 1.0f, 0.0f);
	// Z - Axis blue
	//glColor3f(0.0f, 0.0f, 1.0f);
	glVertex3f(0.0f, 0.0f, 0.0f);
	glVertex3f(0.0f, 0.0f, 1.0f);

My resize Event:

void SWC_MainWindow_SubWin_CandleStick_t::resizeGL(int w, int h)
{
        int side = (w < h) ? w : h;
        glViewport( (w-side)/2, (h-side)/2, side, side );
}


And my vertex shader:

#version 430

in vec3 vp;

uniform mat4 mat4_ViewTransformation;
uniform mat4 mat4_ProjectionTransformation;

void main()
{
  gl_Position = mat4_ProjectionTransformation * mat4_ViewTransformation * vec4(vp, 1.0);
}

If I do not apply the frustum I can see the three lines. Why does the perspective not work?

Okay, I have it.

The near-plane value of the frustum is an absolute distance in front of the eye point; any geometry closer to the eye than that distance gets clipped away.

Now I will do the vertex buffer stuff =)

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.