Point sprite texture coordinates in fragment shader

My problem probably has a simple solution, so I will try to get straight to the point. I am using the point sprites extension for a particle system, so I am passing one vertex per particle to the vertex shader. If I have understood correctly, during the rasterization step a lot of fragments are created from this single vertex (since GL_POINT_SPRITE_ARB is on), according to the current point size. My problem arises when I try to implement my fragment program and have to assign values to all the fragments based on a texture. I simply don’t know where my texture coordinates are! Do I have to take care of sending them myself from my vertex program?

Here is some code, it is not much, and it doesn’t work:

// Multiply texture value by current color
// testTexture is a uniform sampler2D
gl_FragColor = gl_Color*texture2D( testTexture, gl_TexCoord[0].st );
// This is basically what I want my fragment shader to do.

Here is how I handle textures in my main app:

glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glfwLoadTexture2D("particle_w.tga", GLFW_BUILD_MIPMAPS_BIT);
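One step that is easy to miss, and might be the "anything more" in question, is pointing the sampler uniform at the texture unit the texture is bound to. A hedged sketch (the uniform name testTexture comes from the shader above; the program handle name and the use of unit 0 are assumptions):

// Sketch: bind the sampler uniform to texture unit 0.
// Assumes `program` is the linked GLSL program object.
glUseProgram(program);
GLint loc = glGetUniformLocation(program, "testTexture");
glUniform1i(loc, 0);                  // sampler reads from unit 0

glActiveTexture(GL_TEXTURE0);         // bind the texture to that unit
glBindTexture(GL_TEXTURE_2D, textureID);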

Does anything more have to be done?

Please let me know if anything is unclear and I will clarify… I know that the standard behaviour for the fragment shader manages the texturing just fine! I have seen it!

/ Tommy

The texture coordinates are generated automatically. You don’t need to write them from the vertex shader. In fact, it wouldn’t be possible since the vertex shader is only run once. What card are you trying to do this on btw?
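For reference, a hedged sketch of the app-side state that turns this automatic per-sprite texcoord generation on; this is the COORD_REPLACE state from the ARB_point_sprite extension, applied per texture unit:

// Sketch: enable point sprites and have the hardware replace the
// texture coordinates across each sprite with a 0..1 gradient.
glEnable(GL_POINT_SPRITE_ARB);
glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);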

Yes, the coordinates are generated automatically. The problem is, how do I access these coordinates in a fragment program?

My card is GeForce FX5900.

The problem is that the fragment program has to write to gl_FragColor and since I am using a texture, the color of the fragment will depend on this texture. If I could just fish out the texture coordinates I think I would be fine! =)

Thanks for the help! I had been trying to send gl_TexCoord[0].st to texture2D. If I send gl_TexCoord[0] only it seems to work fine!

So the final single line of code is:

// v_color is varying and set to gl_Color in vertex program
gl_FragColor = v_color*texture2D( testTexture, gl_TexCoord[0] );

Hmm, that’s a bug. First, gl_TexCoord[0].st is the correct way and should work. Secondly, using gl_TexCoord[0] without .st is incorrect and shouldn’t compile at all.

It seems much more logical to write gl_TexCoord[0].st! I am a little worried here: I have to leave errors in my code or my program won’t work, but if changes are made to fix this bug, then my program won’t compile in the future?

texture2D must be used with a vec2, so either gl_TexCoord[0].st or vec2(gl_TexCoord[0]) is the right thing to do.
There must be a compilation error in the program you use, and in that case you may be falling back to the fixed pipeline, which happens to look like what you expected.
Check the compilation and link status and the information logs inside your application.
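A minimal sketch of that check, assuming the GL 2.0 entry points (with the older ARB shader-object extension the equivalents are glGetObjectParameterivARB and glGetInfoLogARB):

// Sketch: query compile/link status and print the info logs.
// `vs` is a compiled shader object, `program` a linked program object.
GLint ok;
char log[4096];

glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
if (!ok) {
    glGetShaderInfoLog(vs, sizeof(log), NULL, log);
    fprintf(stderr, "compile failed:\n%s\n", log);
}

glGetProgramiv(program, GL_LINK_STATUS, &ok);
if (!ok) {
    glGetProgramInfoLog(program, sizeof(log), NULL, log);
    fprintf(stderr, "link failed:\n%s\n", log);
}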

I also have a problem concerning point sprites and GLSL. Using the point sprites without GLSL works fine, but when I activate my shader code, the point sprites will not rescale, so all points, regardless of position, are rendered the same size.

Are you writing to gl_PointSize?

gl_PointSize in the shader only makes a difference if glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB) has been called; otherwise it is ignored. At least that is how it behaves on my ATI card.
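A minimal vertex-shader sketch of that combination; the distance-attenuation formula here is just an illustrative assumption, not anything from the thread:

// Vertex shader sketch: shrink the sprite with eye-space distance.
// Requires glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB) on the app side.
void main()
{
    vec4 eyePos   = gl_ModelViewMatrix * gl_Vertex;
    gl_PointSize  = 64.0 / (1.0 + length(eyePos.xyz)); // example attenuation
    gl_Position   = gl_ProjectionMatrix * eyePos;
    gl_FrontColor = gl_Color;
}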

A related question - what range of texture coordinates should you see in the fragment shader? I am doing gl_FragColor = gl_TexCoord[0], and although each sprite is coloured differently, I was hoping to see the colours change within one sprite (to check that I can use texcoords for texture lookup).

Any ideas?

Thanks in advance,



When you write that line you’ll get the default texcoords on a point sprite (ranging from 0 to 1). You can change this, but I really don’t remember the exact call (check the specification). It works on both ATI and nVidia (I checked it).

I’ve got another question concerning this topic. Is there any way to override clipping of point sprites? It really looks weird when they are clipped when touching the border of the screen.

Another question: I found a simple bug on ATI cards, so I’ll report it now:

When GL_LINE_SMOOTH is enabled and you run a GLSL program, the performance drops to about 0.1 fps. I think it was already reported, but just in case…
