I have a program that runs on an abstraction layer which lets the application execute as either an OpenGL or a DirectX program. Recently I've been implementing a feature that requires point sprites, and in doing so I've run across a strange problem. In a vertex program I set the size of the points via the Cg PSIZE register (which apparently specifies the point's size in samples, X by X). Under DirectX, when I force antialiasing on, the point size on screen shrinks. This makes sense to me, and I can see how one could adjust for it. However, when I run under OpenGL and force antialiasing, the point size stays the same, which is rather odd. I suppose I could apply the scaling only when I detect I'm running under DirectX, and if there's nothing else going on I will, but I'd like to understand why there is a difference in behaviour between the two APIs.
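For reference, here's roughly the shape of what I'm doing, plus the workaround I'm considering. This is a sketch, not my actual code; the `sizeScale` uniform and its value are hypothetical, just to illustrate the per-API compensation idea:

```cg
// Minimal Cg vertex program writing PSIZE. The idea for the workaround:
// feed in a scale factor of 1.0 under GL, and something larger under
// DirectX when multisampling is forced, to cancel out the shrink.
void main(float4 position : POSITION,
          uniform float4x4 modelViewProj,
          uniform float pointSize,   // desired on-screen point size
          uniform float sizeScale,   // hypothetical: 1.0 under GL,
                                     // >1.0 under D3D with MSAA forced
          out float4 oPosition  : POSITION,
          out float  oPointSize : PSIZE)
{
    oPosition  = mul(modelViewProj, position);
    oPointSize = pointSize * sizeScale;  // compensate only where needed
}
```

The application side would detect which API the abstraction layer is running on and set `sizeScale` accordingly, which is exactly the API-specific special-casing I'd rather avoid if there's a cleaner explanation.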
Sounds like a bug in your D3D driver to me. The point size shouldn't change depending on whether multisampling is enabled or not, so the GL behaviour is what I'd expect.