about the alpha test, that’s exactly what i suspected: since you are making a binary decision by thresholding the alpha value, you lose the fuzzy region where blending gives the best results… oh well…
about depth sort, yes, the concept is right.
but you don’t have to do any ray tracing/collision/casting/whatever: just evaluate the depth value (the z coordinate in eye space) of each particle with a bit of math.
then, sort the particles and draw them from back to front, and remember to disable GL_ALPHA_TEST and enable GL_DEPTH_TEST.
to quickly implement a working solution for depth sorting i would use (or rather, i’m using) the c standard library function qsort().
…maybe you’ve already figured out why your previous test showed a square shape around the point, but let me explain it anyway.
opengl creates antialiased points with a 2-pass approach (loosely speaking: i don’t mean it draws to the framebuffer twice).
first, it creates a square shape filled with a solid color (the current glColor() state).
second, it builds another square containing the circular shape, but instead of storing a color, it stores an alpha value. that’s why you need GL_BLEND enabled to render antialiased primitives.
then, when it’s time to draw to the framebuffer, opengl transforms the vertex you passed between the glBegin()/glEnd() pair and obtains an eye-space point, which has x, y and z coordinates.
opengl then blits the square to the framebuffer, and since this blitting operation follows the same rules as any other opengl drawing, if depth testing is enabled it draws to the screen only where the point is nearer than the current depth, which is given by the depth buffer values at that location.
that’s why you need to do depth sorting, with depth ordering from back to front.
at least, with Quake2-style particles.