Point sprite visual glitches

Hello,

Why are there white pixels on my point sprite textures?

This is the 32-bit image:

[image: star]

Vertex:

        void main()
        {
            gl_Position = ftransform();
        }

Fragment:

        uniform sampler2D tex;

        void main()
        {
            vec4 finalColor = texture(tex, gl_PointCoord);

            if (finalColor.w == 0)
                discard;

            gl_FragColor = finalColor;
        }

Drawing:

        gl.Enable(GL_POINT_SPRITE);
        gl.TexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, gl.TRUE);

        Draw();

        gl.Disable(GL_POINT_SPRITE);

Thanks,

Alberto

Most likely because the fully transparent parts of your texture contain the value (1.0, 1.0, 1.0, 0.0). When linear interpolation around the edges produces an alpha value > 0, the fragment passes the alpha test in your fragment shader, and its color is a mix of blue and white.
Either use textures that store pre-multiplied alpha (with the corresponding blend mode), or pad the edges of the opaque areas with the nearby color to avoid this.
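
For the padding option, here is a minimal sketch using the same System.Drawing types (Bitmap, Color) that appear later in this thread; the method name PadTransparentEdges and the one-pixel radius are just illustrative, and mipmapping may need a wider pad:

    public static Bitmap PadTransparentEdges(Bitmap source)
    {
        Bitmap dest = new Bitmap(source.Width, source.Height);

        for (int y = 0; y < source.Height; y++)
        {
            for (int x = 0; x < source.Width; x++)
            {
                Color c = source.GetPixel(x, y);

                if (c.A == 0)
                {
                    // Borrow the RGB of a neighboring non-transparent pixel,
                    // keeping alpha at 0, so filtering no longer mixes in white.
                    bool padded = false;
                    for (int dy = -1; dy <= 1 && !padded; dy++)
                    {
                        for (int dx = -1; dx <= 1 && !padded; dx++)
                        {
                            int nx = x + dx, ny = y + dy;
                            if (nx < 0 || ny < 0 || nx >= source.Width || ny >= source.Height)
                                continue;

                            Color n = source.GetPixel(nx, ny);
                            if (n.A > 0)
                            {
                                c = Color.FromArgb(0, n.R, n.G, n.B);
                                padded = true;
                            }
                        }
                    }
                }

                dest.SetPixel(x, y, c);
            }
        }

        return dest;
    }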

This is usually caused by the transparent colour bleeding into pixels with partial alpha as a result of linear filtering or mipmap generation. The transparent colour for this image is white. Either set the transparent colour to match the rest of the image (0,0,178) or convert to pre-multiplied alpha after loading (and change the blending mode accordingly).

Thanks!

Ah, my image editing tool does not allow me to change the color of the transparent pixels. Do you know a tool that handles this?

I’ve changed the RGB of the transparent pixels in the bitmap from 255,255,255 to 0,0,255 in code, but I don’t see any improvement… :frowning_face:

Did you change the blending mode?

It should be 0,0,178. 255 is too bright. Also, try changing it to something completely different like 255,0,0,0 just to see if you’re actually changing what you think you’re changing.

That isn’t necessary if the transparent colour is changed to match the rest of the image (which is 0,0,178). Note that changing the blending mode requires converting the image to use pre-multiplied alpha, not just fixing the transparent colour.

How do I accomplish this? Can you link a page where I can learn more about it?

  1. Replace [r,g,b,a] with [a*r,a*g,a*b,a]. Any pixel with zero alpha will be [0,0,0,0].

  2. Change the blending mode to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA). The source colour has already been multiplied by its alpha so you don’t need GL_SRC_ALPHA as the source factor.
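
As a rough sketch in the style of the drawing code above (the wrapper method and constant names may differ in your binding), step 2 amounts to:

        gl.Enable(GL_BLEND);
        // The texture colour already carries its alpha, so the source factor
        // is GL_ONE rather than GL_SRC_ALPHA.
        gl.BlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);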

I was unable to resolve this.

Currently, I am processing the image this way. The source bitmap has fully blue areas, fully transparent areas, and some antialiased pixels between them.

    public static Bitmap ChangeColor(Bitmap source)
    {
        Bitmap dest = new Bitmap(source.Width, source.Height);

        for (int i = 0; i < source.Width; i++)
        {
            for (int j = 0; j < source.Height; j++)
            {
                Color prevColor = source.GetPixel(i, j);
                
                if (prevColor.A > 0)
                    dest.SetPixel(i, j, Color.FromArgb(prevColor.A, prevColor.R * prevColor.A/255, prevColor.G * prevColor.A/255, prevColor.B * prevColor.A/255));
                else
                    dest.SetPixel(i, j, Color.FromArgb(0,0,0,0));
            }
        }

        return dest;
    }

What am I doing wrong? I’ve tried standard blending and glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) blending.

Thanks,

Alberto

No idea. That function is correctly converting to pre-multiplied alpha. Based upon the original image, every pixel should have r=0, g=0, b=178*a/255. IOW, there shouldn’t be any white pixels in the image (write a test to check this). If you’re still seeing them, I’d guess that you’re uploading the original image rather than the fixed copy.
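
For example, a quick check along these lines with the same Bitmap API (the method name is made up here, and the one-unit tolerance on b allows for rounding in the premultiply step):

    public static bool LooksPremultiplied(Bitmap bmp)
    {
        for (int y = 0; y < bmp.Height; y++)
        {
            for (int x = 0; x < bmp.Width; x++)
            {
                Color c = bmp.GetPixel(x, y);
                int expectedB = c.A * 178 / 255;

                // Every pixel of the converted star should be (0, 0, 178*a/255, a).
                if (c.R != 0 || c.G != 0 || Math.Abs(c.B - expectedB) > 1)
                    return false;
            }
        }

        return true;
    }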

Are you drawing the points last? You can’t rely upon depth testing if you’re using blending. You can use depth testing with alpha testing (i.e. treating the alpha channel as a binary mask), but not with blending. If you’re using blending, anything with partial alpha has to be rendered from back to front.

If you try to use depth testing with blending, any fragment with partial alpha will get blended against the framebuffer contents at the time it is rendered and update the depth buffer. Anything drawn behind it will then get discarded (and not affect the pixel colour).
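
Concretely, one common ordering looks roughly like this in the style of your drawing code; DrawOpaqueGeometry and DrawPointSprites are placeholders for your own calls, and the wrapper names are assumed:

        // Opaque geometry first, with normal depth testing and depth writes.
        gl.Enable(GL_DEPTH_TEST);
        DrawOpaqueGeometry();

        // Blended point sprites last: keep the depth test so they are hidden
        // behind opaque geometry, but disable depth writes while blending.
        gl.DepthMask(gl.FALSE);
        gl.Enable(GL_BLEND);
        gl.BlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
        gl.Enable(GL_POINT_SPRITE);
        gl.TexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, gl.TRUE);

        DrawPointSprites();

        gl.Disable(GL_POINT_SPRITE);
        gl.Disable(GL_BLEND);
        gl.DepthMask(gl.TRUE);

Overlapping sprites with partial alpha may still need to be sorted back to front among themselves, as noted above.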

I still have b=255; should I lower it to 178?

Is this performing integer or floating point math?

Also, completely unrelated: when processing image data I would recommend making the outer loop over y (height), so that the inner loop processes adjacent pixels in a row, which are much more likely to be in cache.
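
That is, something along these lines, with the per-pixel body unchanged:

    for (int y = 0; y < source.Height; y++)      // rows in the outer loop
    {
        for (int x = 0; x < source.Width; x++)   // adjacent pixels of a row in the inner loop
        {
            Color prevColor = source.GetPixel(x, y);
            // ... same per-pixel processing as before ...
        }
    }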

Suggestion. Remove this:

Get it working perfectly. Then re-add it if you must.

If there are problems, this'll give you better pics to post and make the problem easier to troubleshoot.

How did you get b=255? In the original image (in the first post), the only colours are (0,0,178,a) and (255,255,255,0).

Are you drawing your point sprites with depth test and depth write enabled, and (particularly) drawing them before your other geometry?

How did you get b=255?

Sorry, my mistake. I meant the alpha of the blue pixels.

Here is the drawing code:

        gl.Enable(GL_BLEND);
        gl.Enable(GL_POINT_SPRITE);
        gl.TexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, gl.TRUE);

        Draw();

        gl.Disable(GL_POINT_SPRITE);
        gl.Disable(GL_BLEND);

Vertex shader:

        void main()
        {
            gl_Position = ftransform();
        }

Fragment shader:

       uniform sampler2D tex;

       void main()
       {
           vec4 finalColor = texture(tex, gl_PointCoord);

           if (finalColor.w == 0)
               discard;

           gl_FragColor = finalColor;
       }

Bitmap color change code:

    public static Bitmap ChangeColor(Bitmap source)
    {
        source.Save("source.png", ImageFormat.Png);

        Bitmap dest = new Bitmap(source.Width, source.Height);

        for (int i = 0; i < source.Width; i++)
        {
            for (int j = 0; j < source.Height; j++)
            {
                Color prevColor = source.GetPixel(i, j);
                
                if (prevColor.A > 0)
                    dest.SetPixel(i, j, Color.FromArgb(prevColor.A, (int)(prevColor.R * prevColor.A/255.0), (int)(prevColor.G * prevColor.A/255.0), (int)(prevColor.B * prevColor.A/255.0)));
                else
                    dest.SetPixel(i, j, Color.FromArgb(0,0,0,0));
            }
        }
       
        dest.Save("dest.png", ImageFormat.Png);

        return dest;
    }

Dest bitmap:
[image: dest]

Source bitmap:
[image: star]

The conversion is fine. So I’m still inclined to suspect that you aren’t drawing them last.

My apologies, I didn’t originally see that you’d already suggested this. But yes, that’s clearly what’s happening here: the point sprites are drawn first and alpha-blended against the white background, and then when the other geometry is drawn they end up with white fringes.

Yes, I wasn’t drawing the points last.

Resolved, thank you!