Lights appearing on the same spot

[First post]

Hi all,
I’m having a weird issue with my scene where all the lights appear to sit in the upper-right corner of my second cube. At least that’s what it looks like; I’m not even sure that’s actually what’s happening.

So, I have two cubes (referred to as “objects” in my code) and three lights.
In my fragment shader, the lights are an array of Light structs.
In my C++ code, the position, model matrix, light properties, and material properties are passed to the shader via glUniform*() calls.

And there’s another issue, the third light won’t even show up.

I’m on Linux, and my code is mainly C++ but with a bit of C. Here’s some of the code:

Fragment shader (cubes):

#version 330 core
out vec4 FragColor;

struct Material {
    vec3 ambient;
    vec3 diffuse;
    vec3 specular;
    float shininess;
};

struct Light {
    vec3 position;
    vec3 ambient;
    vec3 diffuse;
    vec3 specular;
    float constant;
    float linear;
    float quadratic;
};

Light light;
in vec3 FragPos;
in vec3 Normal;
in vec2 TexCoords;

uniform vec3 viewPos;
uniform Material material;
uniform Light lights[3];

vec3 CalcPointLight(Light light, vec3 normal, vec3 fragPos, vec3 viewDir) {
    vec3 lightDir = normalize(light.position - fragPos);
    float diff = max(dot(normal, lightDir), 0.0);
    vec3 reflectDir = reflect(-lightDir, normal);
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), material.shininess);
    float distance = length(light.position - fragPos);
    float attenuation = 1.0 / (light.constant + light.linear * distance + light.quadratic * (distance * distance));  
    vec3 ambient = light.ambient * material.ambient;
    vec3 diffuse = light.diffuse * diff * material.diffuse;
    vec3 specular = light.specular * spec * material.specular;
    ambient *= attenuation;
    diffuse *= attenuation;
    specular *= attenuation;
    return (ambient + diffuse + specular);
}

void main() {
    vec3 norm = normalize(Normal);
    vec3 viewDir = normalize(viewPos - FragPos);
    vec3 result = CalcPointLight(lights[0], norm, FragPos, viewDir)*lights[0].constant;
    result += CalcPointLight(lights[1], norm, FragPos, viewDir)*lights[1].constant;
    result += CalcPointLight(lights[2], norm, FragPos, viewDir)*lights[2].constant;
    /*for(int i = 0; i < lights.length(); i++)
        result += CalcPointLight(lights[i], norm, FragPos, viewDir)*lights[i].constant;*/
    // the results are the same anyways
    FragColor = vec4(result, 1.0);
} 

Light creation/rendering code:

#include "scene/light.hpp"
#include <stdio.h>

int light = 0;

char *light_prop(const char *attribute) {
    static char result[20];
    sprintf(result, "lights[%d].%s", light, attribute);
    return result;
}

GLuint create_light(glm::vec3 position, GLuint *objects, glm::vec3 view_pos, glm::mat4 projection, glm::mat4 view, glm::mat4 model) {
    GLuint program = create_program("./dat/shaders/light.vs", "./dat/shaders/light.fs");
    model = glm::translate(model, position);
    for (int i = 0; i < sizeof(objects)/sizeof(GLuint); i++) {
        glUseProgram(objects[i]);
        glUniform3fv(glGetUniformLocation(objects[i], light_prop("position")), 1, glm::value_ptr(position));
        glUniform3fv(glGetUniformLocation(objects[i], light_prop("ambient")), 1, glm::value_ptr(glm::vec3(1, 1, 1)));
        glUniform3fv(glGetUniformLocation(objects[i], light_prop("diffuse")), 1, glm::value_ptr(glm::vec3(1, 1, 1)));
        glUniform3f(glGetUniformLocation(objects[i], light_prop("specular")), 0.1f, 0.1f, 0.1f);
        glUniform1f(glGetUniformLocation(objects[i], light_prop("constant")), 0.1f);
        glUniform1f(glGetUniformLocation(objects[i], light_prop("linear")), 0.2f);
        glUniform1f(glGetUniformLocation(objects[i], light_prop("quadratic")), 0.032f);
    }
    glUseProgram(program);
    glUniform3fv(glGetUniformLocation(program, "selfColor"), 1, glm::value_ptr(glm::vec3(1.0f)));
    glUniformMatrix4fv(glGetUniformLocation(program, "model"), 1, GL_FALSE, glm::value_ptr(model));
    light += 1;
    return program;
}

void render_lights(GLuint *lights, unsigned int vao) {
    for (int i = 0; i < sizeof(lights)/sizeof(GLuint); i++) {
        glUseProgram(lights[i]);
        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 36);
    }
}

I keep an array of the programs returned by create_light(), and render_lights() is called in my main loop.

Help would be greatly appreciated!
Thanks. :slight_smile:

Please don’t take this the wrong way, but there seem to be a lot of things wrong with the code you posted - or I’m misunderstanding it big time…

  • In C/C++ you cannot recover the size of an array from the pointer to the first element. That means code like:

    void some_function(GLuint* numbers)
    {
        int array_length = sizeof(numbers)/sizeof(GLuint);   // does NOT compute length of array
    }
    

    is wrong; it does not compute the length of the array. The only time sizeof works like this is when it is applied directly to an array whose size is known at compile time:

    GLuint some_numbers[4];
    int array_length = sizeof(some_numbers)/sizeof(some_numbers[0]);
    

    However, as soon as you pass some_numbers to a function, the array decays to a pointer to its first element and you cannot recover the array length. The length must either be passed as a separate function argument or, since you are using C++, you can use a proper container like std::vector that tracks the size for you.

  • In create_light you create a program object (program), but then set uniforms on a bunch of other program objects (objects[i]), and finally set a few (different and fewer) uniforms on program?! I’m confused about what is going on there…
    A more typical setup is to create one program object that contains the code to evaluate your material parameters and apply lighting. You bind that program, configure uniforms for the light sources in the scene and the viewer position, then loop over all objects: for each one, set its material parameter uniforms and draw it.
    Of course you can have more than just the one program if you start to have more varied materials (different numbers/kinds of textures), but that is moving beyond what you are doing at this point.

  • I cannot tell in which coordinate space your lighting is supposed to be calculated. You need to decide on one coordinate space, transform all relevant (surface position and normal, direction to viewer, direction to light source) entities to that space and then perform calculations. Typical choices are object space or world space. The important thing is picking one and being consistent.

  • Is render_lights supposed to draw the visualization of the light sources (white cubes) or is it supposed to apply lighting to other objects (yellow cubes) in the scene?


Please don’t take this the wrong way,

Yeah, that’s the problem with text-based communication… everyone takes what you say the wrong way.

I’ve been working on this for 2 years… what a noob I am. Never mind my stupidity.

  1. OK, I’ll use std::vector instead (arrays are stupid).
  2. program is the light’s program, while objects is the list of object programs that need the properties/positions/etc. of the lights in the scene. So in each create_light() the light sets the uniforms for that object (the lighting shader is the cube shader). The “different and fewer” uniforms are the light’s own uniforms (selfColor is the light’s color, which is white for all lights, and model is the light’s model matrix). Is what I’m doing here wrong? One program that passes uniforms to other programs, is that what you mean? Yes, I was planning to add more materials. I think the light properties will stay the same anyway, since it’s the way the shader calculates lighting that matters. (Back to the main problem, I don’t want to go off-topic.)
  3. What coordinate space? I just pass the position and view position uniforms to the shader… am I missing something here or… ?
  4. It renders the lights themselves. Not the yellow cubes.

I’ll just post the whole code soon and I’ll make an edit with the link in it once I’ve done that.
EDIT: The link will probably die someday but here it is anyways: https://mtm828.github.io/uploads/tmp/opengl-project.tar.gz
The tutorial I’m following is learnopengl.com and I’m stuck at the Multiple Lights tutorial.
Honestly, it’s a terrible tutorial, but the problem is that there aren’t many good beginner-friendly tutorials out there fitting my needs.

Let me be a little more clear here about the uniforms and stuff,

  • I have Light lights[SOME_INT_HERE] in my cube shader
  • objects[] is a list of cube programs that use the cube shader
  • create_light() loops through them, running glUseProgram(objects[i]); glUniform*(...); and setting all the needed uniforms.
    I think I’m already doing what you suggested.

EDIT: After debugging with RenderDoc, all the uniforms seem to be getting sent… I don’t think they’re the problem.

Thanks. :grinning:

You should not have a program object for every object that you render. Instead, have one program object for each material type. The body of your main loop would then look something like this:

foreach materialtype:
    bind program object for materialtype
    set object and material independent uniforms (view, projection matrix, light source parameters)

    foreach material of materialtype:
        set material parameter uniforms (colors, textures, etc)

        foreach object using material:
            set object specific uniforms (model matrix, combined modelviewprojection matrix etc)
            draw object

Switching program objects is an expensive operation (performance-wise) compared to changing uniforms, so if many objects use the same program object, just with different parameters, you want to pay only the lower cost of changing uniforms instead of switching program objects.

Vertex positions are stored in object space (relative to an object’s origin). The model matrix of each object transforms from object space to world space, allowing you to place a model in your scene without modifying the vertex positions themselves. The view matrix transforms from world space to view space, where the camera is located at the origin. The projection matrix transforms from view space to clip space; the vertex shader must emit gl_Position in this coordinate space (typically by applying the combined model-view-projection matrix to a position vertex attribute). The hardware then performs the perspective division to get normalized device coordinates (NDC).

So when you say you pass the view position as a uniform to the shader, it is presumably in world space. That means position must also be in world space, otherwise you cannot meaningfully subtract them from each other to get the direction from the surface point to the viewer (which goes into the lighting computations). Your vertex shader should therefore output a varying (I think you call it FragPos) that is the position vertex attribute transformed by the model matrix, so you get the world-space position of the vertex.
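A vertex shader sketch along those lines (the attribute names and locations are assumptions, not taken from your code):

```glsl
#version 330 core
layout (location = 0) in vec3 aPos;     // assumed attribute names/locations
layout (location = 1) in vec3 aNormal;

out vec3 FragPos;  // world-space position, matching the fragment shader's input
out vec3 Normal;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main() {
    FragPos = vec3(model * vec4(aPos, 1.0));              // world space
    Normal  = mat3(transpose(inverse(model))) * aNormal;  // normal matrix
    gl_Position = projection * view * vec4(FragPos, 1.0);
}
```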

Ok. I’ll be a bit pedantic and say it renders a (debug) visualization of your light sources. Light sources themselves are generally not visible (only their effect on other objects in the scene) because they are usually calculated using idealized shapes (points or directions). Note in particular that in the pseudocode I outlined above there is no draw call associated with the lights; they are passed as uniforms to the material type. Of course, in order to have lights visualized as small cubes you do need to emit a draw call.
It’s useful to have these visualizations but part of my confusion looking at your code was because your create_light function deals with both these aspects. I feel it would be clearer to separate these things.


Thank you, Carsten.
Although you’ve pointed out some really useful information there, I still haven’t found a real fix to this problem.
I can do performance optimisation/code cleaning later.
Yeah, they are being drawn; render_lights() is called in the main loop.
create_light() only does that because I’m trying to easily generate a scene.
Even with std::vector, only one light works.