Point Light implementation with GLSL - Code Included

I am trying to set up a basic pink point light at (0, 2, 0) to illuminate my textured plane (quad). I currently have two problems:

  1. The point light seems to follow the camera? … I tried to transform the Position by my projection matrix, but that didn't seem to help.
  2. The point light only illuminates half of my quad; it doesn't fade out smoothly. … I tried adjusting my attenuation parameters, but it's very finicky: either there is no effect, or everything turns bright pink.

Initial Camera View @ (0,2,0)
[ATTACH=CONFIG]815[/ATTACH]

Angled Camera Down and Zoomed Out
[ATTACH=CONFIG]814[/ATTACH]

Edit:
It seems that for shallow camera angles the point light works as intended, but at other angles it is completely broken, as shown above.

[ATTACH=CONFIG]816[/ATTACH]

Edit2:
Figured out my own problem. I tried to transform my normals, but did it wrong the first time. Use this link: http://www.songho.ca/opengl/gl_normaltransform.html


transformed_normal = transpose(inverse(mat3(projection))) * normalize(normal);

Did you copy some of the code from tutorials? Typically lighting computations are done in camera-relative space.

Why does the fragment shader need an interpolated projection matrix? If you want to use the projection matrix in the fragment shader, you can use the uniform.
I don't see a reason to use it, and you aren't going to have a different projection matrix for each vertex that needs to be interpolated.

You are assigning the projection matrix to the out qualifier as already explained above. Where do you use the “transform” matrix you specified as uniform above?
Are your input vertex positions already camera relative? Does the “projection” matrix contain the product of modelview and projection matrix?

For some reason, I keep seeing things like this a lot lately. If diffuseFactor is zero (the case where you skip the computation), the result is zero
anyway, as it gets multiplied with diffuseFactor. This branch is pointless; you are not getting any performance gains out of it, especially if
the compiler keeps the branch. Branching on the GPU is expensive.

So: you transform the light position with the projection matrix that you interpolate across the triangle for some reason, instead of using the uniform.
If your "projection" matrix contains both a camera transform and a projection matrix, you transform the light into clip space and use clip-space fragment
positions for your light computation.

Could you try separating the camera and projective transformations and doing the lighting computations in view space instead of clip space? Instead of
interpolating the camera and perspective transformations across the triangles, you might want to use the uniform values, or perhaps move the transformation
into the vertex shader altogether and interpolate the resulting vector.
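A minimal sketch of what that vertex shader could look like. The uniform and variable names here are illustrative, not taken from the original code, and it assumes separate model-view and projection matrices:


// Vertex shader sketch -- names are illustrative, not from the original code.
uniform mat4 model_view;   // model -> view (camera) space
uniform mat4 projection;   // view -> clip space

in vec3 position;
in vec3 normal;

out vec3 view_position;    // view-space position, interpolated for the fragment shader
out vec3 view_normal;      // view-space normal

void main()
{
	vec4 vp = model_view * vec4(position, 1.0);
	view_position = vp.xyz;

	// Normal matrix: inverse-transpose of the upper-left 3x3 of model_view.
	view_normal = transpose(inverse(mat3(model_view))) * normal;

	// Projection is applied only to gl_Position, never to the lighting inputs.
	gl_Position = projection * vp;
}

The fragment shader would then take the light position as a uniform already transformed into view space on the CPU and do the lighting with view_position and view_normal, so the projection matrix never touches the lighting math.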

Thank you for the response.

Yes. I am following a recent Youtube series: https://www.youtube.com/watch?v=ss3AnSxJ2X8

For some reason, I keep seeing things like this a lot lately. If diffuseFactor is zero (the case where you skip the computation), the result is zero
anyway, as it gets multiplied with diffuseFactor. This branch is pointless; you are not getting any performance gains out of it, especially if
the compiler keeps the branch. Branching on the GPU is expensive.

Great point! I have corrected this in my code.

Why does the fragment shader need an interpolated projection matrix?

It doesn't. I couldn't figure out my problem, and since I am new to the shader language I was trying different things. It's a remnant.

Where do you use the “transform” matrix you specified as uniform above?

I wasn't using it anymore; same as above. GLSL would optimize it out anyway, but I removed it.

I have another problem with my point light. It looked like it was working with a single plane, but with a cube it's not working: the back and front faces aren't being lit. See picture.

[ATTACH=CONFIG]818[/ATTACH]

Lighting Calculations


vec3 calcDirectLight(float intensity, vec3 color, vec3 direction, vec3 normal)
{
	// Lambertian diffuse term; direction points from the light toward the surface.
	float diffuseFactor = max(dot(normal, -direction), 0.0);

	return color * intensity * diffuseFactor;
}

vec3 calcPointLight(float intensity, vec3 color, vec3 position, float constant, float linear, float exponent, vec3 normal)
{
	vec3 lightDir = vertex_position - position;

	float lightDist = length(lightDir);

	lightDir = lightDir / lightDist;

	// The 0.1 bias keeps the denominator away from zero.
	float attenuation = 1.0 / (0.1 + constant + linear * lightDist + exponent * lightDist * lightDist);

	vec3 diffuse = calcDirectLight(intensity, color, lightDir, normal);

	return diffuse * attenuation;
}

Cube Implementation


	float halfWidth = 50.0f;
	float halfLength = 50.0f;

	float height = 100.0f;

	float x_lo = -halfWidth;
	float x_hi = halfWidth;
	float z_lo = -halfLength;
	float z_hi = halfLength;
	float y_lo = 0;
	float y_hi = height;

	Vector3f pColor = Vector3f(1, 1, 1);

	//Bottom Plane
	Plane plane1 = Plane(
		Vertex(Vector3f(x_lo, y_lo, z_lo), pColor, Vector3f(0, 1, 0), Vector2f(0, 0)),
		Vertex(Vector3f(x_lo, y_lo, z_hi), pColor, Vector3f(0, 1, 0), Vector2f(1, 0)),
		Vertex(Vector3f(x_hi, y_lo, z_hi), pColor, Vector3f(0, 1, 0), Vector2f(1, 1)),
		Vertex(Vector3f(x_hi, y_lo, z_lo), pColor, Vector3f(0, 1, 0), Vector2f(0, 1)),
		false,
		false
		);

	//Top Plane
	Plane plane2 = Plane(
		Vertex(Vector3f(x_lo, y_hi, z_lo), pColor, Vector3f(0, -1, 0), Vector2f(0, 0)),
		Vertex(Vector3f(x_lo, y_hi, z_hi), pColor, Vector3f(0, -1, 0), Vector2f(1, 0)),
		Vertex(Vector3f(x_hi, y_hi, z_hi), pColor, Vector3f(0, -1, 0), Vector2f(1, 1)),
		Vertex(Vector3f(x_hi, y_hi, z_lo), pColor, Vector3f(0, -1, 0), Vector2f(0, 1)),
		false,
		true
		);

	//Left Plane
	Plane plane3 = Plane(
		Vertex(Vector3f(x_lo, y_hi, z_hi), pColor, Vector3f(1, 0, 0), Vector2f(0, 0)),
		Vertex(Vector3f(x_lo, y_lo, z_hi), pColor, Vector3f(1, 0, 0), Vector2f(1, 0)),
		Vertex(Vector3f(x_lo, y_lo, z_lo), pColor, Vector3f(1, 0, 0), Vector2f(1, 1)),
		Vertex(Vector3f(x_lo, y_hi, z_lo), pColor, Vector3f(1, 0, 0), Vector2f(0, 1)),
		false,
		false
		);

	//Right Plane
	Plane plane4 = Plane(
		Vertex(Vector3f(x_hi, y_hi, z_hi), pColor, Vector3f(-1, 0, 0), Vector2f(0, 0)),
		Vertex(Vector3f(x_hi, y_lo, z_hi), pColor, Vector3f(-1, 0, 0), Vector2f(1, 0)),
		Vertex(Vector3f(x_hi, y_lo, z_lo), pColor, Vector3f(-1, 0, 0), Vector2f(1, 1)),
		Vertex(Vector3f(x_hi, y_hi, z_lo), pColor, Vector3f(-1, 0, 0), Vector2f(0, 1)),
		false,
		true
		);

	//Back Plane
	Plane plane5 = Plane(
		Vertex(Vector3f(x_hi, y_hi, z_lo), pColor, Vector3f(0, 0, 1), Vector2f(0, 0)),
		Vertex(Vector3f(x_lo, y_hi, z_lo), pColor, Vector3f(0, 0, 1), Vector2f(1, 0)),
		Vertex(Vector3f(x_lo, y_lo, z_lo), pColor, Vector3f(0, 0, 1), Vector2f(1, 1)),
		Vertex(Vector3f(x_hi, y_lo, z_lo), pColor, Vector3f(0, 0, 1), Vector2f(0, 1)),
		false,
		false
		);

	//Front Plane
	Plane plane6 = Plane(
		Vertex(Vector3f(x_hi, y_hi, z_hi), pColor, Vector3f(0, 0, -1), Vector2f(0, 0)),
		Vertex(Vector3f(x_lo, y_hi, z_hi), pColor, Vector3f(0, 0, -1), Vector2f(1, 0)),
		Vertex(Vector3f(x_lo, y_lo, z_hi), pColor, Vector3f(0, 0, -1), Vector2f(1, 1)),
		Vertex(Vector3f(x_hi, y_lo, z_hi), pColor, Vector3f(0, 0, -1), Vector2f(0, 1)),
		false,
		true
		);

	plane1.Draw();
	plane2.Draw();
	plane3.Draw();
	plane4.Draw();
	plane5.Draw();
	plane6.Draw();

If I change the normal vector of the front/back faces to something like (1,0,0) or (0,1,0), it lights the plane, so the lighting is working. Just something going on with the normals along Z?

Still trying to debug this. It looks like my Z-values are broken for the normals. Below is a picture of the cube colored by the normal values.


frag_color = texture_color * abs(vec4(normal, 1.0));

The back plane should be a blue textured plane… but it's black. Somehow the Z-values are lost.

[ATTACH=CONFIG]819[/ATTACH]

Here is how I am setting up the constructors:


//Back Plane
Plane plane5 = Plane(
	Vertex(Vector3f(x_hi, y_hi, z_lo), pColor, Vector3f(0, 0, -1), Vector2f(0, 0)),
	Vertex(Vector3f(x_lo, y_hi, z_lo), pColor, Vector3f(0, 0, -1), Vector2f(1, 0)),
	Vertex(Vector3f(x_lo, y_lo, z_lo), pColor, Vector3f(0, 0, -1), Vector2f(1, 1)),
	Vertex(Vector3f(x_hi, y_lo, z_lo), pColor, Vector3f(0, 0, -1), Vector2f(0, 1)),
	false,
	false
	);

...

void Plane::addVertices(vec_vertex vertices)
{
	if (vbo)
	{
		//Bind
		glBindBuffer(GL_ARRAY_BUFFER, vbo);

		//Load Vertices
		glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex)* vertices.size(), &vertices[0], GL_STATIC_DRAW);

		glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)0);
		glEnableVertexAttribArray(0);
		glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)sizeof(Vector3f));
		glEnableVertexAttribArray(1);
		glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)(sizeof(Vector3f)+sizeof(Vector3f)));
		glEnableVertexAttribArray(2);
		glVertexAttribPointer(3, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)(sizeof(Vector3f)+sizeof(Vector3f)+sizeof(Vector3f)));
		glEnableVertexAttribArray(3);
	}
}

...

Vertex::Vertex(Vector3f position, Vector3f color, Vector3f normal, Vector2f uvcoords)
{
	Position = Vector3f(position.getX(), position.getY(), position.getZ());
	Color = Vector3f(color.getX(), color.getY(), color.getZ());
	Normal = Vector3f(normal.getX(), normal.getY(), normal.getZ());
	UVCoords = Vector2f(uvcoords.getX(), uvcoords.getY());
}

...

Vector3f::Vector3f(float fx, float fy, float fz)
{
	setX(fx);
	setY(fy);
	setZ(fz);
}

...

float Vector3f::getX() { return x; }
float Vector3f::getY() { return y; }
float Vector3f::getZ() { return z; }

void Vector3f::setX(float f) { x = f; }
void Vector3f::setY(float f) { y = f; }
void Vector3f::setZ(float f) { z = f; }

...


Still have no idea what's going on. What else could it be? Help plz :smiley:

Are you sure that attribute 2 should only have 2 components, not 3?

Thank you!! I forgot to change the line when I added the normals to my VAO. So silly :wink:

Corrected Code


glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)(sizeof(Vector3f)+sizeof(Vector3f)));

But does it affect the result?

Passing 2 instead of 3 for that parameter would certainly explain why the back face was black rather than blue: you’re ignoring the Z component and using 0 instead (if you don’t supply all 4 components for an attribute, y and z default to 0 and w defaults to 1), so the value passed to the vertex shader is (0,0,0,1).

Yes. All works now. Sending in PointLight arrays now. Much progress :smiley: