I want to create a render target for a shadow map, but I get an unhandled exception at 0x0000000000000000

There is a crash (unhandled exception at 0x0000000000000000) on the line glGenFramebuffers(1, &framebufferName);, and I noted in a comment above it:
#if 1 // I'm not sure where I should place this #if block, because if I set the number to 0 there is no crash.

Note: when I say I "crash on a line", I mean that stepping over that line with F10 in the debugger raises the unhandled exception; the debugger stops right after that line.

Is there an alternative to the Vulkan validation layers in OpenGL? I tried adding GLenum e = glGetError(); right after the crashing lines (glGenFramebuffers() or glFramebufferTexture()), but the program never even reaches glGetError() because the unhandled exception occurs first. How can I detect such errors in OpenGL, given that I get no error information at all?
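For later readers: the closest OpenGL analogue to validation layers is the debug output mechanism (core in GL 4.3, or via KHR_debug). A minimal sketch, assuming a current GL context created with the debug flag and an initialised loader; note this reports GL errors at the offending call, but it cannot catch the crash in this thread, which happens before any GL call executes (a null function pointer):

```cpp
// Request a debug context BEFORE creating the context:
//   SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_DEBUG_FLAG);

// Callback invoked by the driver for every debug message.
static void GLAPIENTRY DebugCallback(GLenum source, GLenum type, GLuint id,
                                     GLenum severity, GLsizei length,
                                     const GLchar* message, const void* userParam) {
    fprintf(stderr, "GL debug: %s\n", message);
}

void EnableGLDebugOutput() {
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); // deliver messages at the call site
    glDebugMessageCallback(DebugCallback, nullptr);
}
```

Call EnableGLDebugOutput() once after the context is current and the loader is initialised; driver messages then arrive synchronously, so a breakpoint inside the callback lands on the faulty call's stack.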

Goal:
I want to create a render target for a shadow map, but I always crash on either glGenFramebuffers() or, sometimes, glFramebufferTexture(), and I have no idea how to fix it.

Code referenced from “imgui/example_sdl_opengl3/main.cpp”

Link: https://github.com/ocornut/imgui/blob/master/examples/example_sdl_opengl3/main.cpp
And this is the tutorial I am trying to apply inside the #if block: Tutorial 16: Shadow mapping

int main(int, char **) {
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER | SDL_INIT_GAMECONTROLLER) != 0) {
        printf("Error: %s\n", SDL_GetError());
        return -1;
    }

    // Decide GL+GLSL versions
#if defined(IMGUI_IMPL_OPENGL_ES2)
    ...
#elif defined(__APPLE__)
    ...
#else
    // GL 3.0 + GLSL 130
    const char* glsl_version = "#version 130";
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, 0);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
#endif

    // Create window with graphics context
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);
    SDL_WindowFlags window_flags = (SDL_WindowFlags)(SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE | SDL_WINDOW_ALLOW_HIGHDPI);
    SDL_Window* window = SDL_CreateWindow("Dear ImGui SDL2+OpenGL3 example", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 1280, 720, window_flags);
    SDL_GLContext gl_context = SDL_GL_CreateContext(window);
    SDL_GL_MakeCurrent(window, gl_context);
    SDL_GL_SetSwapInterval(1); // Enable vsync

    // Setup Dear ImGui context
    IMGUI_CHECKVERSION();
    ImGui::CreateContext();
    ImGuiIO& io = ImGui::GetIO(); (void)io;
    //io.ConfigFlags |= ImGuiConfigFlags_NavEnableKeyboard;     // Enable Keyboard Controls
    //io.ConfigFlags |= ImGuiConfigFlags_NavEnableGamepad;      // Enable Gamepad Controls

    // Setup Dear ImGui style
    ImGui::StyleColorsDark();
    //ImGui::StyleColorsClassic();

    // Setup Platform/Renderer backends
    ImGui_ImplSDL2_InitForOpenGL(window, gl_context);
    ImGui_ImplOpenGL3_Init(glsl_version);

    // Our state
    bool show_demo_window = true;
    bool show_another_window = false;
    ImVec4 clear_color = ImVec4(0.45f, 0.55f, 0.60f, 1.00f);

    #if 1 // I'm not sure where I should place this #if block, because if I set the number to 0 there is no crash.
    {
        // Originated from [Basic OpenGL > "tutorial16_shadowmaps"](https://www.opengl-tutorial.org/intermediate-tutorials/tutorial-16-shadow-mapping/)

        // The framebuffer, which regroups 0, 1, or more textures, and 0 or 1 depth buffer.
        GLuint framebufferName = 0;
        glGenFramebuffers(1, &framebufferName); // <-- Error: Exception has occurred: W32/0xC0000005  Unhandled exception at 0x0000000000000000 in (...).exe
        glBindFramebuffer(GL_FRAMEBUFFER, framebufferName);

        // Depth texture. Slower than a depth buffer, but you can sample it later in your shader
        GLuint depthTexture;
        glGenTextures(1, &depthTexture);
        glBindTexture(GL_TEXTURE_2D, depthTexture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT16, 1024, 1024, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
            
        glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, depthTexture, 0);

        // No color output in the bound framebuffer, only depth.
        glDrawBuffer(GL_NONE);

        // Always check that our framebuffer is ok
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            printf("Error\n");
            return -1;
        }
    }
    #endif

    // Main loop
    bool done = false;
    while (!done) {
        ...
    }

    ...
    return 0;
}

This usually indicates a null function pointer. Where is glGenFramebuffers defined? If it’s defined as a pointer, is the pointer being initialised?

On Windows, opengl32.dll only exports the OpenGL 1.1 API. For anything else, you need to use a loader (GLEW, GL3W, etc) which will declare OpenGL functions as function pointers, but you need to call the loader’s initialisation function to initialise those pointers. On Linux, a loader isn’t necessary, but if you’re using one you have to initialise it.
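As a concrete sketch of what "initialise the loader" means with GLEW (assuming GLEW is the loader in use, as it is in this thread): call glewInit() once, after the context has been made current and before any post-1.1 GL call.

```cpp
// After SDL_GL_CreateContext() and SDL_GL_MakeCurrent():
glewExperimental = GL_TRUE;   // needed on older GLEW versions with core profiles
GLenum err = glewInit();
if (err != GLEW_OK) {
    fprintf(stderr, "glewInit failed: %s\n", glewGetErrorString(err));
    return -1;
}
```

Without this, every post-1.1 entry point (including glGenFramebuffers) is a null pointer, which produces exactly the "unhandled exception at 0x0000000000000000" seen here.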


Thank you, that really helped. I looked into glewInit(), but I was wondering: why is only glGenFramebuffers() available, while glFramebufferTexture() is a null pointer?

The solution is to add the following line before calling glewInit():
glewExperimental = true; // Needed for core profile

I found that in "tutorial16_SimpleVersion.cpp" from the link I showed. I'm posting this even though the problem is solved, because it wasn't yet solved when I took that screenshot and I only found the solution afterwards. And who knows, maybe everything I post in my threads will help me in the future?

glFramebufferTexture was added in GL 3.2. You could use glFramebufferTexture2D instead.


Thanks, that’s very helpful. Since glewExperimental means "experimental", i.e. not an official feature, it’s better to use the official function like you said: glFramebufferTexture2D().

Setting glewExperimental skips checking whether the function is supposed to be supported based upon the version and/or extension list and just tries to obtain the pointer regardless.

In earlier versions of GLEW, setting glewExperimental was required when using a core profile because the extension check used glGetString(GL_EXTENSIONS) which isn’t supported in the core profile (you have to use glGetStringi to query extensions). This has since been fixed.

On Linux/X11, you can’t rely upon testing function pointers alone because glXGetProcAddress will return a non-null pointer so long as libGL.so exports the symbol, regardless of whether the driver (or X server) implements the function. So you either need to check the version and/or extensions or have the loader do it for you (by not setting glewExperimental).

You don’t need to use glFramebufferTexture unless you want to attach all faces and/or layers of a 3D texture, cube map, or array texture as a layered framebuffer. For a non-layered framebuffer, you can use one of the glFramebufferTexture[123]D functions or glFramebufferTextureLayer (all of which are in 3.0, which is the first version to support FBOs in core). For cube maps, 3D textures and array textures, if you only want to attach a single face or layer then you can’t use glFramebufferTexture as that will attach all faces/layers.
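Applied to the snippet in the question, the GL 3.0-compatible replacement for the single-line glFramebufferTexture() call would look like this (a sketch; depthTexture is the 2D depth texture created above):

```cpp
// GL 3.0 alternative: attach one 2D depth texture to a non-layered FBO.
// glFramebufferTexture2D(target, attachment, textarget, texture, level)
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTexture, 0);
```

The extra textarget parameter (GL_TEXTURE_2D here) is also what lets you pick a single cube-map face, e.g. GL_TEXTURE_CUBE_MAP_POSITIVE_X, instead of attaching all faces as glFramebufferTexture would.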


GClements > Thank you very much, I’m working on the shadow map now. I’ll keep that information in mind while building it.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.