I am currently doing a project for a course in “Advanced Game Programming”. I started out implementing a technique for Order Independent Transparency, and since I had a week to go I thought I would try a simple version of Depth of Field simulation. I read an article where they rendered to different textures depending on the depth. Since I already have all the fragments and their corresponding depths in a linked list, I thought it would be an easy method to implement.
Since the per-pixel operation of sorting and blending doesn’t need any primitives, I am doing it in a compute shader. Calculating the final result and storing it in an image2D works fine; I then render that texture to the screen using a simple fragment shader and a fullscreen quad.
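For context, a minimal sketch of the working single-texture compute path (the work group size and the placeholder color are assumptions; the real shader does the per-pixel linked-list sort/blend here):

```glsl
#version 430

layout (local_size_x = 16, local_size_y = 16) in;

// Working single-layer output image; the binding unit matches the
// host-side glBindImageTexture call.
layout (binding = 3, rgba32f) writeonly uniform image2D out_texture;

void main()
{
    ivec2 coord = ivec2(gl_GlobalInvocationID.xy);

    // Placeholder: in the real shader this is the sorted/blended result
    // of the per-pixel fragment linked list.
    vec4 diffuse = vec4(1.0, 0.0, 0.0, 1.0);

    imageStore(out_texture, coord, diffuse);
}
```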
The problem is that I want a more dynamic version (and maybe lots of layers for a better depth-of-field simulation), so I am trying to render to an image2DArray instead, which is not working at all.
// layouts
layout (binding = 3, rgba32f) writeonly uniform image2D out_texture;      // working
layout (binding = 3, rgba32f) writeonly uniform image2DArray out_texture; // not working

// when rendering
imageStore(out_texture, ivec2(gl_GlobalInvocationID.xy), diffuse);     // working
imageStore(out_texture, ivec3(gl_GlobalInvocationID.xy, 0), diffuse);  // not working
Since I am right now only trying the simple version, with the image2DArray as a one-layer array, I cannot see what I am doing wrong.
// allocating the texture
glGenTextures(1, &dof_textures);
glBindTexture(GL_TEXTURE_2D_ARRAY, dof_textures);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA32F, MAX_FRAMEBUFFER_WIDTH,
             MAX_FRAMEBUFFER_HEIGHT, 1, 0, GL_RGBA, GL_FLOAT, NULL);
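The image binding call itself isn’t shown above; a sketch of what I assume it looks like is below. One detail worth checking: for an image2DArray, the `layered` argument of glBindImageTexture must be GL_TRUE so the whole array is bound; with GL_FALSE only a single layer is bound, and writes declared through an array image in the shader won’t land.

```c
// Hypothetical binding call (not shown in my code above).
glBindImageTexture(3,             // image unit, matching "binding = 3" in the shader
                   dof_textures,  // the GL_TEXTURE_2D_ARRAY allocated above
                   0,             // mip level
                   GL_TRUE,       // layered: bind ALL layers, required for image2DArray
                   0,             // layer index (ignored when layered == GL_TRUE)
                   GL_WRITE_ONLY,
                   GL_RGBA32F);
```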
I’m also fetching the texture back to verify that it is not the final rendering that fails; the data never gets set by the compute shader when using the image2DArray.
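To be concrete about the readback, this is the kind of check I mean (a sketch, assuming glGetTexImage; the memory barrier is needed so the compute shader’s imageStore writes are visible before reading):

```c
// Make the compute shader's image writes visible to the readback.
glMemoryBarrier(GL_TEXTURE_UPDATE_BARRIER_BIT);

// Read the whole array (here: one layer) back into client memory.
float *pixels = malloc(MAX_FRAMEBUFFER_WIDTH * MAX_FRAMEBUFFER_HEIGHT
                       * 1 /* layers */ * 4 /* RGBA */ * sizeof(float));
glBindTexture(GL_TEXTURE_2D_ARRAY, dof_textures);
glGetTexImage(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA, GL_FLOAT, pixels);
```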
What am I doing wrong? Is there any other solution to my “many output texture” need?