gl_PointSize and GL_COORD_REPLACE_ARB under Linux

Hello,

I’ve got a vertex shader that computes the point size for GL_POINTS and writes it to gl_PointSize. The fragment shader then textures the point sprites using the texture coordinates generated when GL_COORD_REPLACE_ARB is set to GL_TRUE.

The problem is that the texture coordinates do not range from [0, 0] → [1, 1] over each point sprite, as expected with GL_COORD_REPLACE_ARB enabled. However, if I do not write to gl_PointSize in the vertex shader, the texture coordinates are correct. I need both: modifying the point size in the vertex shader and texturing each point sprite with coordinates in [0, 0] → [1, 1].
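
For reference, the relevant state and shader code look roughly like this (a simplified sketch, not my actual application; the 16.0 size and the "sprite" sampler name are placeholders):

```cpp
// Enable point sprites and have the texture unit replace the
// texture coordinates across each point:
glEnable(GL_POINT_SPRITE_ARB);
glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);
// Without this, gl_PointSize written in the vertex shader is ignored:
glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB);

// Vertex shader computing the point size:
const char* vsrc =
    "void main() {\n"
    "    gl_PointSize = 16.0; /* placeholder for the computed size */\n"
    "    gl_Position = ftransform();\n"
    "}\n";

// Fragment shader sampling with the replaced coordinates:
const char* fsrc =
    "uniform sampler2D sprite;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(sprite, gl_TexCoord[0].st);\n"
    "}\n";
```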

I’m running SuSE Linux 9 with the Nvidia 81.xx drivers on a GeForce 6800. Any insight into this issue would be appreciated.

Thanks!

Hello again,

False alarm: I found the problem while stripping the full application down to simpler code that attempted to reproduce the error. As it turns out, the order in which textures are loaded matters, and it differs between Linux and the Mac.

For those of you interested: on the Mac, I load a 1D color map texture first and then the 2D normal map, and I get the results I expect. That order did not work under Linux, but loading the normal map first, followed by the color map, did. I’m surprised that order makes a difference, because I was explicitly activating different texture units before calling glTexImage1D/2D, and then binding the fragment shader’s 1D/2D samplers to the appropriate units. I’m not yet well versed in OpenGL, so perhaps this isn’t a surprise to experienced coders.
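
In case it helps, the loading code looks roughly like this (a simplified sketch; colorMapTex, normalMapTex, the data pointers, and the dimensions are placeholders from my app):

```cpp
// Upload the 1D color map on unit 0...
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_1D, colorMapTex);
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA8, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, colorMapData);

// ...and the 2D normal map on unit 1:
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, normalMapTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, normalMapData);
// (The fragment shader's 1D/2D samplers are then bound to units 0/1.)
```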

In any case, attempting to reproduce errors with the simplest possible code is a very useful exercise. I’ll have to keep that in mind from now on.

Thanks for the help,

Christiaan

If the textures are loaded to two different texture image units, and you have made sure the texture samplers were set while the GLSL program was in use (glUniform has no program parameter, which means it only affects the currently bound program), then the order in which the textures are loaded doesn’t matter.
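
That is, something along these lines (a sketch; prog and the sampler names are placeholders):

```cpp
glUseProgram(prog); // glUniform* affects this program only while it is bound
glUniform1i(glGetUniformLocation(prog, "colorMap"),  0); // 1D sampler -> unit 0
glUniform1i(glGetUniformLocation(prog, "normalMap"), 1); // 2D sampler -> unit 1
```
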
It does matter if you happened to load the 1D and 2D textures to the same unit: 2D takes precedence over 1D on conventional texture units (the ones that react to glEnable/glDisable).
I’m not sure what weirdness a texture image unit does in that case. I would expect trouble when sourcing texture1D and texture2D from two samplers with the same image unit id, but maybe the driver handles that somehow. Never tried. :wink:
Texture environment state is set per texture unit. Make sure you have it set for the correct one.
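
For example (assuming the sprite texture lives on unit 1; adjust to your setup):

```cpp
// GL_COORD_REPLACE_ARB is texture environment state, stored per unit,
// so select the unit your sprite texture uses before setting it:
glActiveTexture(GL_TEXTURE1);
glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);
```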

You’re exactly correct. In my OpenGL ignorance, I failed to realize that TexEnv calls are per-unit. It was a simple fix to correct the issue.

Thanks for the help.
