Odd reflection problem occurs yet again!

Annnnnd I’m back with the same odd reflection problem I had a few days ago. :tired:
Recently I tried running my game on a Windows machine and everything was invisible. When I turn off reflection, everything renders correctly but with no reflection. I’ve been told not to use texture2D() in my fragment shader as it’s deprecated, and fair enough, that did solve the problem on my sister’s machine, but the problem still persists on several other machines and I don’t know why. This is getting ridiculous… :confused:

// Vertex Shader .vs


#version 430 core

// Input
layout (location = 0) in vec3 Vertex_Position;
layout (location = 1) in vec2 Vertex_Texture;
layout (location = 2) in vec4 Vertex_Colour;
layout (location = 3) in vec3 Vertex_Normal;

// Output 
out vec4 outputColors;
out vec2 TexCoord;
out vec3 normals;
out vec3 reflectedVector;

// Uniforms
uniform mat4 TransformationMatrix;
uniform vec3 cameraPos;

void main()
{
	vec4 worldPosition = TransformationMatrix * vec4(Vertex_Position, 1.0f);
	gl_Position = worldPosition;
	
	TexCoord = Vertex_Texture;
	outputColors = Vertex_Colour;
	normals = (TransformationMatrix * vec4(Vertex_Normal, 0.0f)).xyz;
	
	vec3 viewVector = normalize(worldPosition.xyz - cameraPos);
	reflectedVector = reflect(viewVector, normals);
} 

// Fragment Shader .fs


#version 430 core

// Inputs
in vec4 outputColors;
in vec2 TexCoord;
in vec3 normals;
in vec3 reflectedVector;

// Output Color
out vec4 VertexColors;

// Uniforms
uniform sampler2D Texture1;
uniform samplerCube enviroMap;
uniform vec3 SpotlightPlayerPos;

// Variables
float ambient = 1.0f;

void main()
{
	// lambertian lighting (Phong - Per Pixel Shading)
	float brightness = max(dot(-vec3(0.0f, 0.0f, 1.0f), normals), 0.0) + ambient;
	VertexColors = texture(Texture1, TexCoord) * clamp(dot(-vec3(0.0f, 0.0f, 1.0f), normals), 0.0, 1.0) * brightness;
	
	// Reflection
	vec4 reflectedColor = texture(enviroMap, reflectedVector);
	VertexColors = mix(VertexColors, reflectedColor, 0.3); 
}

The cubemap vanishes if the mix value is 1.0 instead of 0.3, and then the reflection works. However, if I turn the mix down to 0.3, the cubemap appears but everything else becomes invisible (on certain machines, not all).

I can’t see anything wrong with my shader programs anymore. :frowning:

Have you tried the other suggestion I made last time, i.e. to print out the logs from compilation and linking?

Hey GClements, here’s my shader.cpp:


#include "Shader.h"
#include <vector>

using std::getline;
using std::cout;

std::string Shader::LoaderShaderFile(char* filename)
{
	std::string shaderCode;
	std::ifstream file(filename, std::ios::in);

	if (!file.good())
	{
		std::cout << "ERROR: Unable to read file: " << filename << std::endl;
		std::terminate();
	}

	file.seekg(0, std::ios::end);
	shaderCode.resize((unsigned int)file.tellg());
	file.seekg(0, std::ios::beg);
	file.read(&shaderCode[0], shaderCode.size());

	file.close();
	return shaderCode;
}

GLuint Shader::CreateShader(GLenum shaderType, std::string source, char* shaderName)
{
	int compile_result = 0;

	GLuint shader = glCreateShader(shaderType);
	const char* shader_code_ptr = source.c_str();
	const int shader_code_size = source.size();

	glShaderSource(shader, 1, &shader_code_ptr, &shader_code_size);
	glCompileShader(shader);
	glGetShaderiv(shader, GL_COMPILE_STATUS, &compile_result);

	//check for errors
	if (compile_result == GL_FALSE)
	{
		int info_log_length = 0;
		glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &info_log_length);
		std::vector<char> shader_log(info_log_length);
		glGetShaderInfoLog(shader, info_log_length, NULL, &shader_log[0]);
		std::cout << "ERROR compiling shader: " << shaderName << std::endl << &shader_log[0] << std::endl;
		return 0;
	}

	return shader;
}

void Shader::UpdateSkyboxTexture(GLuint& SkyboxID)
{
	glDepthMask(GL_FALSE);
	glUniform1i(glGetUniformLocation(m_program, "Skybox"), 0);
	glBindTexture(GL_TEXTURE_CUBE_MAP, SkyboxID);
	glDepthMask(GL_TRUE);
}

// Get uniform 
void Shader::UpdateTransform(const MatrixTransform& transformation, const Camera& camera)
{
	m_uniforms[TRANSFORM_U] = glGetUniformLocation(m_program, "TransformationMatrix");
	glm::mat4 ModelMatrix = camera.GetWorldToViewMatrix() * transformation.GetModel();
	glUniformMatrix4fv(m_uniforms[TRANSFORM_U], 1, GL_FALSE, &ModelMatrix[0][0]);
}

GLuint Shader::CreateProgram(char* vertexShaderFilename, char* fragmentShaderFilename)
{
	//read the shader files and save the code
	std::string vertex_shader_code = LoaderShaderFile(vertexShaderFilename);
	std::string fragment_shader_code = LoaderShaderFile(fragmentShaderFilename);

	GLuint vertex_shader = CreateShader(GL_VERTEX_SHADER, vertex_shader_code, "vertex shader");
	GLuint fragment_shader = CreateShader(GL_FRAGMENT_SHADER, fragment_shader_code, "fragment shader");

	int link_result = 0;

	//create the program handle, attach the shaders and link it
	GLuint program = glCreateProgram();
	glAttachShader(program, vertex_shader);
	glAttachShader(program, fragment_shader);

	glLinkProgram(program);
	glGetProgramiv(program, GL_LINK_STATUS, &link_result);
	//check for link errors
	if (link_result == GL_FALSE)
	{
		int info_log_length = 0;
		glGetProgramiv(program, GL_INFO_LOG_LENGTH, &info_log_length);
		std::vector<char> program_log(info_log_length);
		glGetProgramInfoLog(program, info_log_length, NULL, &program_log[0]);
		std::cout << "ERROR: Linking Operation Failed " << std::endl << &program_log[0] << std::endl;
		return 0;
	}
	return program;
}

Shaders do not give me any compilation or linker errors.

No GL errors either?

Just as a suggestion:
I don’t know how you create your GL context, but if you request a “core profile” context, then everything that is deprecated will give you an error. If you code with / for a core profile, you can be (relatively) sure that it will work later on other machines. If necessary, check the context right after creation for several things, like:
– GL version (major AND minor)
– core profile
– etc.

Example: the commonly used GL_QUADS is also deprecated (removed from the core profile).

https://www.khronos.org/opengl/wiki/Get_Context_Info

As an example: right AFTER you’ve initialized GLEW, GLAD or whatever you use to load the GL functions, call:

GLContextInfo infos = GetContextInfos();
if (infos.Flags.IsForwardCompatibleContext)
cout << "GL context is NOT core profile !!!" << endl;

EDIT:
just checked it, use glGetIntegerv to get the GL_CONTEXT_PROFILE_MASK value, and then:
bool iscoreprofile = value & GL_CONTEXT_CORE_PROFILE_BIT;

Besides that, you can query shader info logs and program info logs regardless of whether an error occurred:


std::string ShaderInfoLog(GLuint shader)
{
    if (glIsShader(shader))
    {
        GLint logsize = 0;
        GLchar infolog[1024] = { 0 };
        glGetShaderInfoLog(shader, 1024, &logsize, infolog);

        return std::string(infolog);
    }

    return "invalid shader";
}

std::string ProgramInfoLog(GLuint program)
{
    if (glIsProgram(program))
    {
        GLint logsize = 0;
        GLchar infolog[1024] = { 0 };
        glGetProgramInfoLog(program, 1024, &logsize, infolog);

        return std::string(infolog);
    }

    return "invalid program";
}

Thank you for your replies. Sadly, I still haven’t been able to find a solution to the problem, although I’m getting close (I think). I am using GLEW and freeglut for my context, and I looked into retrieving context information as John suggested. Here’s what I managed to do:
main.cpp:


// Local Includes
#include "Game.h"

int main(int argc, char** argv)
{
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
	glutInitWindowPosition(200, 200);
	glutInitWindowSize(WIDTH, HEIGHT);
	glutCreateWindow("Robotron 3D");

	glewInit();
	
	// -------------------
	// DEBUGGING
	// -------------------
	GLint major = 0;
	GLint minor = 0;
	glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &major); 
	glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &minor);
	
	std::cout << "major: " << major << ", " << "minor: " << minor << "\n"; // Output is 2, 2 (I do not fully understand what that means though)
	// -------------------

	//MainMenu();
	InitializeGameContents(); // Temp (remove later uncomment above line)

	// Register callbacks
	glutDisplayFunc(render);
	glutKeyboardFunc(KeyboardDown);
	glutKeyboardUpFunc(KeyboardUp);
	glutSpecialFunc(SpecialKey);
	glutSpecialUpFunc(SpecialKeyUp);

	glutTimerFunc(10, Update, 0);
	glutMainLoop();

	return 0;
}

Here is how my shader.cpp looks at the moment:


// Local Includes
#include "Shader.h"

// Pre-processor directives
#include <vector>

// STD Elements
using std::getline;
using std::cout;

// GLuint shaderID = 0;

std::string ShaderTypeName(GLuint shader)
{
	if (glIsShader(shader))
	{
		GLint shaderType = 0;
		glGetShaderiv(shader, GL_SHADER_TYPE, &shaderType);

		if (shaderType == GL_VERTEX_SHADER)
			return "Vertex Shader";
		if (shaderType == GL_TESS_CONTROL_SHADER)
			return "Tessellation Control Shader";
		if (shaderType == GL_TESS_EVALUATION_SHADER)
			return "Tessellation Evaluation Shader";
		if (shaderType == GL_GEOMETRY_SHADER)
			return "Geometry Shader";
		if (shaderType == GL_FRAGMENT_SHADER)
			return "Fragment Shader";
		if (shaderType == GL_COMPUTE_SHADER)
			return "Compute Shader";
	}

	return "invalid shader";
}

std::string Shader::LoaderShaderFile(char* filename)
{
	std::string shaderCode;
	std::ifstream file(filename, std::ios::in);

	if (!file.good())
	{
		std::cout << "ERROR: Unable to read file: " << filename << std::endl;
		std::terminate();
	}

	file.seekg(0, std::ios::end);
	shaderCode.resize((unsigned int)file.tellg());
	file.seekg(0, std::ios::beg);
	file.read(&shaderCode[0], shaderCode.size());

	file.close();
	return shaderCode;
}

GLuint Shader::CreateShader(GLenum shaderType, std::string source, char* shaderName)
{
	int compile_result = 0;

	GLuint shader = glCreateShader(shaderType);
	const char* shader_code_ptr = source.c_str();
	const int shader_code_size = source.size();

	glShaderSource(shader, 1, &shader_code_ptr, &shader_code_size);
	glCompileShader(shader);
	glGetShaderiv(shader, GL_COMPILE_STATUS, &compile_result);

	//check for errors
	if (compile_result == GL_FALSE)
	{
		int info_log_length = 0;
		glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &info_log_length);
		std::vector<char> shader_log(info_log_length);
		glGetShaderInfoLog(shader, info_log_length, NULL, &shader_log[0]);
		std::cout << "ERROR compiling shader: " << shaderName << std::endl << &shader_log[0] << std::endl;
		return 0;
	}

	return shader;
}

void Shader::UpdateSkyboxTexture(GLuint& SkyboxID)
{
	glDepthMask(GL_FALSE);
	glUniform1i(glGetUniformLocation(m_program, "Skybox"), 0);
	glBindTexture(GL_TEXTURE_CUBE_MAP, SkyboxID);
	glDepthMask(GL_TRUE);
}

// Get uniform 
void Shader::UpdateTransform(const MatrixTransform& transformation, const Camera& camera)
{
	m_uniforms[TRANSFORM_U] = glGetUniformLocation(m_program, "TransformationMatrix");
	glm::mat4 ModelMatrix = camera.GetWorldToViewMatrix() * transformation.GetModel();
	glUniformMatrix4fv(m_uniforms[TRANSFORM_U], 1, GL_FALSE, &ModelMatrix[0][0]);
}

std::string Shader::ShaderInfoLog(GLuint shader)
{
	if (glIsShader(shader))
	{
		GLint logsize = 0;
		GLchar infolog[1024] = { 0 };
		glGetShaderInfoLog(shader, 1024, &logsize, infolog);

		return std::string(infolog);
	}

	return "invalid shader";
}

std::string Shader::ProgramInfoLog(GLuint program)
{
	if (glIsProgram(program))
	{
		GLint logsize = 0;
		GLchar infolog[1024] = { 0 };
		glGetProgramInfoLog(program, 1024, &logsize, infolog);

		return std::string(infolog);
	}

	return "invalid program";
}

GLuint Shader::CreateProgram(char* vertexShaderFilename, char* fragmentShaderFilename)
{
	//read the shader files and save the code
	std::string vertex_shader_code = LoaderShaderFile(vertexShaderFilename);
	std::string fragment_shader_code = LoaderShaderFile(fragmentShaderFilename);

	GLuint vertex_shader = CreateShader(GL_VERTEX_SHADER, vertex_shader_code, "vertex shader");
	GLuint fragment_shader = CreateShader(GL_FRAGMENT_SHADER, fragment_shader_code, "fragment shader");

	int link_result = 0;

	//create the program handle, attach the shaders and link it
	GLuint program = glCreateProgram();
	glAttachShader(program, vertex_shader);
	glAttachShader(program, fragment_shader);

	glLinkProgram(program);
	glGetProgramiv(program, GL_LINK_STATUS, &link_result);
	//check for link errors
	if (link_result == GL_FALSE)
	{
		int info_log_length = 0;
		glGetProgramiv(program, GL_INFO_LOG_LENGTH, &info_log_length);
		std::vector<char> program_log(info_log_length);
		glGetProgramInfoLog(program, info_log_length, NULL, &program_log[0]);
		std::cout << "ERROR: Linking Operation Failed " << std::endl << &program_log[0] << std::endl;
		return 0;
	}

	// ----------------------
	// DEBUGGING
	// ----------------------
	std::string CheckVertexShader = "", CheckFragShader = "", CheckType = "";
	CheckVertexShader = ShaderInfoLog(vertex_shader);
	CheckFragShader = ShaderInfoLog(fragment_shader);

	std::cout << CheckVertexShader << "\n";
	std::cout << CheckFragShader << "\n";

	CheckType = ShaderTypeName(fragment_shader);
	std::cout << CheckType << "\n";
	// ------------------------------------------------

	return program;
}

So I haven’t been able to determine the cause of the problem yet, but maybe I’ll get something from the shader logs when I run the code on the other machine (the one that renders everything invisible). I don’t have access to it right now, so I’ll have to wait for tomorrow to test that out. Hopefully I’ll be able to get this issue sorted out, and if not, then at least I tried. I’m still very thankful for your help :slight_smile:

Personally, I print the logs regardless of the compilation/linking status; warnings can be useful.

Also, some combinations of state will generate a GL error if you try to execute the program, so check for GL errors after the draw call.

Well, after spending a whole week attempting to fix this nonsense, I think I have finally found the root of the problem I was (and technically still am) having. Long story short, it turned out that the computers giving me the unexpected outcome had outdated graphics drivers, which I believe is what causes this problem to occur. I tried updating the drivers, but I needed admin rights, which I did not have in that lab. However, I did try running my code on some slightly newer computers with updated graphics drivers, and everything ran just fine.

While I’m already writing this reply, I might as well report another strange thing I discovered while reworking the reflection and shader code a million times. If I don’t sample my textures in the fragment shader, the reflection effect actually works nicely, but when I put the textures back (for example the player or grid texture), that is where things go haywire. Obviously there was a compromise, and I didn’t want to remove my textures just to get the bloody reflection effect to work properly on those machines. So I kept trying to diagnose the issue until it dawned on me to check the yellow exclamation mark on the NVIDIA GeForce graphics driver icon on those computers.

I would like to provide a link to my release .exe so somebody could give it a go and tell me how the game runs for them, but I don’t know whether I’d be violating any forum rules by doing that. Nonetheless, I’d like to thank the people above once again for helping me identify the problem.

Cheers guys :slight_smile:

EDIT: In case you’re wondering, this was a 3D game project for my university degree. I spoke to my lecturer about this issue hours prior to submission day, and they told me they’d give me full marks on the reflection section regardless, as it definitely sounded like a driver issue to them and not a coding one.