Problem outputting uvec4 to gl_FragColor

Hello all,

I am trying to use the GL_EXT_texture_integer and GL_EXT_gpu_shader4 extensions in a simple application. I create an unsigned-integer RGB texture, attach it to an FBO, and try to write to gl_FragColor from my fragment shader.
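For reference, the texture/FBO setup looks roughly like this (a sketch only; width and height are placeholders, and the enum names come from EXT_texture_integer and EXT_framebuffer_object):

```c
GLuint tex, fbo;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Integer textures require nearest filtering. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
/* Internal format GL_RGB32UI_EXT, external format GL_RGB_INTEGER_EXT. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI_EXT, width, height, 0,
             GL_RGB_INTEGER_EXT, GL_UNSIGNED_INT, NULL);

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);
```

glCheckFramebufferStatusEXT reports the FBO as complete with this setup.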

My OpenGL code seems fine, and I am able to attach the uint texture to the FBO. My problem seems to stem from not being able to output a uvec4 value to gl_FragColor. I get the following error from my simple fragment shader:

(5) : error C7011: implicit cast from "uvec4" to "vec4"

#version 120
#extension GL_EXT_gpu_shader4 : enable
void main()
{
    gl_FragColor = uvec4(1u, 1u, 1u, 1u);
}

If I bind my FBO first and then attach and link my shader, the error turns into a warning, but the end result is the same (undefined output).

Haven’t tried it under Linux yet, but on my XP partition I am using the 169.21 driver with my nVidia 8800 GTS.

Has anyone been able to successfully write to an integer texture from within a fragment program? Any help would be appreciated.


You can’t write non-float values to gl_FragColor, which is declared as a vec4. Instead, declare your own integer output variable and bind it with BindFragDataLocationEXT, as defined in EXT_gpu_shader4. More specifically, look at issue #9 of that spec. :wink:
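As a sketch, assuming EXT_gpu_shader4 is supported (the output name fragData is arbitrary):

```glsl
#version 120
#extension GL_EXT_gpu_shader4 : enable
varying out uvec4 fragData;   // user-declared unsigned-integer output
void main()
{
    fragData = uvec4(1u, 1u, 1u, 1u);
}
```

Then, on the C side, call glBindFragDataLocationEXT(program, 0, "fragData") before glLinkProgram so the output is routed to color attachment 0.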

Now, if only it were as clear about how integer inputs are supposed to come into the fragment shader. Like from DrawPixels.

What does DrawPixels have to do with a fragment shader?

As far as I know, every "pixel" from DrawPixels counts as a fragment. So basically, it should run through the shader too. Never tried it, though.

I tried using DrawPixels in an experiment to see if gl_Color would be redefined as an integer vector, and I discovered that most of the formats I tried resulted in some kind of software fallback with DrawPixels… never using that function again! :frowning:

I have an 8800 GTS.
