I’m using GL_POINTS and want to glEnable(GL_POINT_SMOOTH), but only if the point antialiasing is done in hardware - otherwise this call will absolutely kill performance in my point-heavy application.
One of the hardware vendors I support has this characteristic - nice acceleration in general, but no AA points.
How to detect?
Not sure you can tell whether it’s supported in hardware, other than gathering empirical evidence. The simple workaround I’ve used is to avoid GL_POINTS altogether. In my case, GL_POINTS were particularly problematic: their appearance was inconsistent from one OS/hardware platform to the next, and larger point sprites (using ARB_point_sprite, which allows arbitrary 2D textures to be rendered with GL_POINTS) are clipped in their entirety as soon as the center of the sprite falls outside the view (not a problem for tiny sprites).
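Since the only real signal is empirical, one hedged approach is to time a large point batch with and without GL_POINT_SMOOTH and treat a dramatic slowdown as evidence of a software fallback. This is a sketch, not a guaranteed detector: the 8x ratio threshold is an assumption you’d tune per platform, and the GL measurement itself (shown in the comment) needs a current context, so only the decision logic is given as code.

```c
/* Sketch: decide whether GL_POINT_SMOOTH looks software-emulated, based on
   two measured frame times.  The 8.0 threshold is an assumed heuristic. */
#include <stdbool.h>

bool point_smooth_looks_emulated(double smooth_ms, double aliased_ms)
{
    const double kFallbackRatio = 8.0;  /* assumed; tune empirically */
    if (aliased_ms <= 0.0)
        return false;                   /* measurement too small to trust */
    return smooth_ms / aliased_ms > kFallbackRatio;
}

/* Measurement outline (requires a current GL context):
 *   glDisable(GL_POINT_SMOOTH); glFinish(); t0 = now();
 *   glDrawArrays(GL_POINTS, 0, N); glFinish(); aliased_ms = now() - t0;
 *   glEnable(GL_POINT_SMOOTH);  glFinish(); t0 = now();
 *   glDrawArrays(GL_POINTS, 0, N); glFinish(); smooth_ms  = now() - t0;
 * Use a large N so the draw dominates, and take the best of several runs. */
```

You’d run this once at startup and cache the result, since the timing pass itself is exactly the slow path you’re trying to avoid.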
The workaround is to do it yourself, and in most cases you’ll have far fewer headaches than you would with GL_POINTS. If you want consistent hardware support and consistent appearance, it boils down to:
- submit each sprite’s geometry as a set of 4 vertices, each positioned at the sprite’s center (for lots of sprites, batch them into a single glDrawArrays call); the geometry must carry an attribute that tags each vertex with an int from 0 to 3 (effectively a “corner ID”) so the corners can be distinguished from one another
- set up a program local parameter array of 4 offsets, corresponding to the corners of the sprite
- process your sprites’ vertices with a vertex program that picks up each vertex’s “corner ID” attribute and offsets it via a lookup into the program local parameter array (the values in the array move the initially coincident vertices out to the four corners of the sprite)
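The corner-ID lookup above can be sketched on the CPU so the logic is easy to see; in the real scheme the same offset table lives in the program local parameter array and the lookup runs in the vertex program. The names here (Vertex2, expand_corner) are illustrative, not from any API.

```c
/* Sketch of the corner-ID expansion: all four vertices of a sprite are
   submitted at its center, and the corner ID selects an offset that moves
   each one out to its corner. */
typedef struct { float x, y; } Vertex2;

/* One offset per corner ID (0..3): lower-left, lower-right,
   upper-right, upper-left, in half-size units. */
static const Vertex2 kCornerOffsets[4] = {
    { -1.0f, -1.0f }, {  1.0f, -1.0f },
    {  1.0f,  1.0f }, { -1.0f,  1.0f },
};

/* Exactly what the vertex program would do with the parameter array. */
Vertex2 expand_corner(Vertex2 center, float half_size, int corner_id)
{
    Vertex2 v;
    v.x = center.x + half_size * kCornerOffsets[corner_id].x;
    v.y = center.y + half_size * kCornerOffsets[corner_id].y;
    return v;
}
```

Since the expansion is billboard-like, a real vertex program would typically apply the offsets in eye or clip space so the quads stay screen-aligned.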
Thanks for the response. Would this be more costly than GL_POINTS?