I’m developing a 2D GUI in Java using the LWJGL OpenGL binding, and I’ve run into a problem that I think I’ve isolated to a lack of subpixel precision. Querying GL state (GL_SUBPIXEL_BITS) on my laptop’s 8 MB S3 Savage/IX reports 3 bits of subpixel precision, but the OpenGL spec mandates a minimum of 4.
Is this a common occurrence in today’s drivers and hardware? Do manufacturers stick as closely to the spec as they probably should?
Is there any way for me to confirm that this is indeed true?
This sounds bad. I think most consumer hardware has 4 bits of precision, while “expensive” HW (e.g. 3Dlabs) has something like 10 bits.
You should be able to devise a test program with which you can determine whether the precision is 3 or 4 bits. For instance, draw a sufficiently long line at some angle between two points on the screen, and vary the screen coordinates of one endpoint in steps of 1/2^4 pixel. If some pixel(s) of the line change on every step, you have 4 bits of precision; if they only change on every second step, you likely have 3… Anyway, you get the general idea.
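To see why 1/16-pixel steps distinguish the two cases, here is a small pure-Java sketch (no GL context needed) of the snapping arithmetic, assuming the rasterizer rounds coordinates to the nearest representable subpixel position:

```java
public class SubpixelQuant {
    // Snap a coordinate to n bits of subpixel precision
    // (round-to-nearest is an assumption about the rasterizer).
    static double quantize(double x, int bits) {
        double steps = 1 << bits; // 2^bits positions per pixel
        return Math.round(x * steps) / steps;
    }

    public static void main(String[] args) {
        for (int bits : new int[] {3, 4}) {
            int distinct = 0;
            double prev = Double.NaN;
            // Step an endpoint across one pixel in 1/16-pixel increments.
            for (int i = 0; i <= 16; i++) {
                double q = quantize(i / 16.0, bits);
                if (q != prev) distinct++;
                prev = q;
            }
            System.out.println(bits + " bits -> " + distinct
                    + " distinct positions out of 17 steps");
        }
    }
}
```

With 3 bits, consecutive 1/16 steps frequently snap to the same position (9 distinct positions out of 17), while with 4 bits every step lands on a new one, so the rendered line changes every step.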
You can also try moving a smooth-shaded quad in subpixel steps, read back the rendered image with glReadPixels, and detect actual motion by forming sum(abs(this_image(x,y)-previous_image(x,y))).
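The readback comparison is just a sum of absolute per-component differences between two frames. A minimal sketch of that metric on byte arrays, assuming the frames were read back with GL_RGB/GL_UNSIGNED_BYTE:

```java
public class FrameDiff {
    // Sum of absolute differences between two images of equal size,
    // e.g. as filled by glReadPixels(..., GL_RGB, GL_UNSIGNED_BYTE, ...).
    // A nonzero sum means the quad actually moved between the two
    // subpixel steps; zero means the step was quantized away.
    static long sumAbsDiff(byte[] thisImage, byte[] previousImage) {
        long sum = 0;
        for (int i = 0; i < thisImage.length; i++) {
            // Java bytes are signed; mask to recover the 0..255 component.
            sum += Math.abs((thisImage[i] & 0xFF) - (previousImage[i] & 0xFF));
        }
        return sum;
    }

    public static void main(String[] args) {
        byte[] a = {(byte) 200, (byte) 200, (byte) 200};
        byte[] b = {(byte) 199, (byte) 201, (byte) 200};
        System.out.println(sumAbsDiff(a, b)); // prints 2
    }
}
```

Plot the sum against the subpixel offset: it should change at every 1/16-pixel step for 4 bits of precision, and only every second step for 3.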