Weird ImageStore Behaviour

I found that neither my fragment shader nor my compute shader can imageStore anything into a multi-channel image texture initialized this way:

Method A:

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, imgWidth, imgHeight, 0, GL_RGBA, GL_FLOAT, NULL);

I got an all-zero buffer when I read the texture back with glGetTextureImage.
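
The readback is essentially this (simplified, not my exact code; glGetTextureImage is the GL 4.5 DSA entry point):

    /* Simplified readback: pull the full RGBA32F level 0 back as floats. */
    float *pixels = malloc(imgWidth * imgHeight * 4 * sizeof(float));
    glGetTextureImage(tex, 0, GL_RGBA, GL_FLOAT,
                      imgWidth * imgHeight * 4 * sizeof(float), pixels);
    /* With Method A every value comes back as 0.0f; with Method B I see the stored values. */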

However, when I changed the initialization to this:

Method B:

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA32F, imgWidth, imgHeight);

The problem was fixed.

(For both methods, I bind the texture to the image unit with

    glBindImageTexture(1, tex, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);

)
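
On the shader side, the image is declared and written roughly like this (simplified sketch, not my exact shader; only binding = 1 and the rgba32f format are the real values from above):

    // Simplified write side (compute variant; the fragment-shader path does the same imageStore).
    #version 450
    layout(local_size_x = 16, local_size_y = 16) in;
    layout(binding = 1, rgba32f) uniform image2D img;   // matches the glBindImageTexture call above

    void main()
    {
        imageStore(img, ivec2(gl_GlobalInvocationID.xy), vec4(1.0, 2.0, 3.0, 4.0));  // arbitrary non-zero values
    }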

The weird thing is that Method A works fine when I use a single-channel texture format (e.g. GL_R32F).

Is this an incorrect use of the API, or is it a driver problem?

Thanks.

Are you using glMemoryBarrier?

I added glMemoryBarrier(GL_ALL_BARRIER_BITS) between the draw call and the glGetTextureImage call, but I still get an all-zero image when using initialization Method A.
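
Concretely, the ordering is (simplified; the draw call and the buffer variables are placeholders):

    /* The imageStore writes happen in the draw/dispatch; the barrier sits before the readback. */
    glDrawArrays(GL_TRIANGLES, 0, 3);       /* placeholder for my actual draw call (or glDispatchCompute) */
    glMemoryBarrier(GL_ALL_BARRIER_BITS);   /* should make the imageStore writes visible to the readback */
    glGetTextureImage(tex, 0, GL_RGBA, GL_FLOAT, bufSize, pixels);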

It looks like it might be a driver bug. Which GPU and driver are you using?

If you post a short, standalone test program, folks here can try it on their hardware and report the results, as well as help check that you're not missing something important.
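
Something along these lines would do as a starting point (untested sketch; it assumes a GL 4.5 core context is already current and a loader such as glad or GLEW is initialized, and it skips all error checking):

    /* Untested sketch of a standalone repro. Context/window creation and
       GL function loading are assumed to be done elsewhere. */
    #include <stdio.h>
    #include <stdlib.h>

    static const char *cs_src =
        "#version 450\n"
        "layout(local_size_x = 16, local_size_y = 16) in;\n"
        "layout(binding = 1, rgba32f) uniform image2D img;\n"
        "void main() {\n"
        "    imageStore(img, ivec2(gl_GlobalInvocationID.xy), vec4(1.0, 2.0, 3.0, 4.0));\n"
        "}\n";

    static void run_test(int useTexStorage)   /* 0 = Method A, 1 = Method B */
    {
        const int W = 64, H = 64;

        GLuint cs = glCreateShader(GL_COMPUTE_SHADER);
        glShaderSource(cs, 1, &cs_src, NULL);
        glCompileShader(cs);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, cs);
        glLinkProgram(prog);

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
        if (useTexStorage)
            glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA32F, W, H);                            /* Method B */
        else
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, W, H, 0, GL_RGBA, GL_FLOAT, NULL);  /* Method A */

        glBindImageTexture(1, tex, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);
        glUseProgram(prog);
        glDispatchCompute(W / 16, H / 16, 1);
        glMemoryBarrier(GL_ALL_BARRIER_BITS);

        float *pixels = malloc(W * H * 4 * sizeof(float));
        glGetTextureImage(tex, 0, GL_RGBA, GL_FLOAT, W * H * 4 * sizeof(float), pixels);
        printf("Method %s: texel(0,0) = %f %f %f %f\n",
               useTexStorage ? "B" : "A", pixels[0], pixels[1], pixels[2], pixels[3]);

        free(pixels);
        glDeleteTextures(1, &tex);
        glDeleteProgram(prog);
        glDeleteShader(cs);
    }

    /* Call run_test(0) and run_test(1) after context creation and compare the printed texel. */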
