Phong highlight distorted on simple quad

I’ve implemented per-pixel Phong shading in GLSL. On curved surfaces, like the Stanford bunny,
the results look fine to me; however, when I apply the same shading to a plane ( geometry: just 2 triangles… ),
a weird distortion of the specular highlight appears.

http://fs1.directupload.net/images/141124/rnb7rund.png

The highlight seems to be broken into two parts along the diagonal.

I’ve had the fragment shader output the normal, view vector and light vector as the fragment color,
and they looked fine to me - no obvious discontinuities or anything.

Is there an explanation for this? ( Other than me doing something wrong in the shader :wink: )
It reminds me of the effect you get when UV coordinates are not perspective corrected.
However, I used the smooth interpolation qualifier for all attributes, so that can’t be the reason.
Also, the distortion effect starts to disappear when I tessellate the plane. It’s less obvious when
I use 8 triangles instead of 2, and more or less gone when I use 32 triangles.

Maybe you are not properly clamping a value somewhere. Could you please post your shader source?

vertex shader:

#version 330 core

layout ( location = 0 ) in vec3 vertexPosition;
layout ( location = 1 ) in vec3 vertexNormal;

out vec3 toLight;
out vec3 toEye;
out vec3 normal;

uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;

void main()
{
	// transform the vertex and the hardcoded light position into eye space
	vec4 vertexPosition_eye = view * model * vec4( vertexPosition, 1.0 );
	vec4 lightPosition_eye = view * vec4( 0.0, 1.0, 0.0, 1.0 );

	toLight = normalize( lightPosition_eye.xyz - vertexPosition_eye.xyz );
	toEye = normalize( -vertexPosition_eye.xyz );

	// normal matrix: inverse transpose of the upper-left 3x3 of the modelview matrix
	mat3 normalTf = inverse( transpose( mat3( view * model ) ) );
	normal = normalize( normalTf * vertexNormal );

	gl_Position = projection * vertexPosition_eye;
}

fragment shader:

#version 330 core

layout ( location = 0 ) out vec4 fragColor;

in vec3 normal;
in vec3 toLight;
in vec3 toEye;

void main()
{
	vec3 N = normalize( normal );
	vec3 L = normalize( toLight );
	vec3 V = normalize( toEye );

	float dotNL = max( dot( N, L ), 0.0 );

	float phongTerm = 0.0;
	if ( dotNL > 0.0 )
	{
		vec3 R = reflect( -L, N );
		phongTerm = pow( max( dot( R, V ), 0.0 ), 100.0 );
	}

	vec3 specularColor = phongTerm * vec3( 0.66, 0.0, 0.33 );
	vec3 diffuseColor = dotNL * vec3( 0.33, 0.0, 0.11 );

	fragColor = vec4( clamp( diffuseColor + specularColor, vec3( 0.0 ), vec3( 1.0 ) ), 1.0 );
}

I’ve hardcoded all uniforms ( light color, light position, diffuse / specular absorption coefficients, shininess ) except the transformation matrices.

Both the vertex and fragment shader look ok so far.

On a side note, why do you have a branch in your fragment shader? ( Branches can be expensive, and it doesn’t seem necessary here. )
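If you want to keep the check, something like this ( just a sketch ) gives the same result without an explicit branch, by turning the condition into a multiplier:

vec3 R = reflect( -L, N );
// float( dotNL > 0.0 ) is 1.0 when the surface faces the light and 0.0 otherwise,
// so the specular term is masked exactly like in the if-version
float phongTerm = pow( max( dot( R, V ), 0.0 ), 100.0 ) * float( dotNL > 0.0 );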

There must be some discontinuity along the edge. How do you generate the quad geometry? You said you tried to output normal,
view and light vectors. What about the reflection vector? Have you tried normalizing the reflection vector? Judging from the
formula used to compute it, I’m not quite sure if the output is normalized even if the input is.

[QUOTE=Agent D;1262666]What about the reflection vector? Have you tried normalizing the reflection vector? Judging from the
formula used to compute it, I’m not quite sure if the output is normalized even if the input is.[/QUOTE]
Provided that the normal vector has unit length, the reflected vector has the same magnitude as the incidence vector.
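For reference, GLSL defines reflect( I, N ) as I - 2 * dot( N, I ) * N, so with d = N · I:

\[
R \cdot R = ( I - 2dN ) \cdot ( I - 2dN ) = I \cdot I - 4d^{2} + 4d^{2}\,|N|^{2} ,
\]

which equals I · I exactly when N has unit length.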

Well, I have tried a couple of different ways of creating the geometry; so far nothing has changed the weird distortion.

Currently, I use an extremely simple setup: a single VBO with 4 positions, rendered as a triangle strip. I hardcoded the vertex normal in the shader, so there is only one attribute left.

initialization:


const GLfloat quad[] = { -1.f, 0.f, -1.f, /**/ 1.f, 0.f, -1.f, /**/ -1.f, 0.f, 1.f, /**/ 1.f, 0.f, 1.f };

GLuint bufferHandle = 0;
glGenBuffers( 1, &bufferHandle );
glBindBuffer( GL_ARRAY_BUFFER, bufferHandle );
glBufferData( GL_ARRAY_BUFFER, sizeof( quad ), reinterpret_cast< GLvoid const* >( quad ), GL_STATIC_DRAW );
glBindBuffer( GL_ARRAY_BUFFER, 0 );

rendering:


glEnableVertexAttribArray( 0 );

glBindBuffer( GL_ARRAY_BUFFER, bufferHandle );
glVertexAttribPointer( 0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof( float ), nullptr );
glBindBuffer( GL_ARRAY_BUFFER, 0 );

glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 );

Here’s the full cpp file, containing a ‘minimal sample’: https://www.dropbox.com/s/lo2admvuexz4rpm/main.cpp?dl=0

Try normalizing the fragment shader inputs only in the fragment shader, rather than in the vertex shader. Perhaps you’re running into interpolation errors that are being exaggerated by the Phong pow() function. Because the two triangles are interpolated separately, you’re seeing the difference in error where they meet.
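In other words: output the raw ( unnormalized ) vectors from the vertex shader and only normalize per fragment, roughly like this ( a sketch of the change in your vertex shader; your fragment shader already normalizes its inputs ):

toLight = lightPosition_eye.xyz - vertexPosition_eye.xyz;  // no normalize() here
toEye = -vertexPosition_eye.xyz;                           // normalize only in the fragment shader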

Thanks a lot :slight_smile: this does indeed fix the problem.
It’s funny - when I learned about the common shading models, I was told that normalizing all shading-relevant vectors in both the vertex
and the fragment shader was the way to go, because not doing so ‘would cause shading artifacts’. So I guess this is true only for the
normal ( if the model matrix contains a scaling ) but not for the other vectors.
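In hindsight, I think the reason is simply that the corner vectors have very different lengths on a big quad: interpolating already-normalized vectors generally points in a different direction than interpolating the raw vectors and normalizing afterwards,

\[
(1 - t)\,\hat{a} + t\,\hat{b} \;\nparallel\; (1 - t)\,a + t\,b \quad \text{in general, unless } |a| = |b| ,
\]

so each triangle accumulates a different error, and the mismatch shows up along the shared diagonal once the pow() sharpens the highlight.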