Z-buffer problems

I’m experiencing terrible z-buffer artifacts (huge bands all over polygons) in 16-bit color mode on TNT 64 Pro and GeForce 2 GTS. They disappear completely in 32-bit color mode. The z-buffer resolution is the same in both cases, and changing it doesn’t make any difference. Fiddling with the znear/zfar ratio doesn’t make any difference either.

How can the color depth affect the z-buffer?
How can I get rid of the problem?
Any help appreciated; lengthy explanations by all means welcome.

On the TNT series, in 16-bit color you get only 16 bits of z-buffer depth. In 32-bit color, you get 24 bits of z-buffer depth and 8 bits of stencil buffer depth.
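
You can check what the driver actually gave you, by the way. A quick sketch (assumes a current GL context; on Windows, include <windows.h> before <GL/gl.h>):

#include <stdio.h>
#include <GL/gl.h>

/* Query the sizes of the buffers the driver actually allocated.
   Requires a current OpenGL context. */
void print_buffer_depths(void)
{
    GLint colorBits[4], depthBits, stencilBits;
    glGetIntegerv(GL_RED_BITS,     &colorBits[0]);
    glGetIntegerv(GL_GREEN_BITS,   &colorBits[1]);
    glGetIntegerv(GL_BLUE_BITS,    &colorBits[2]);
    glGetIntegerv(GL_ALPHA_BITS,   &colorBits[3]);
    glGetIntegerv(GL_DEPTH_BITS,   &depthBits);
    glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
    printf("color R%d G%d B%d A%d, depth %d, stencil %d\n",
           colorBits[0], colorBits[1], colorBits[2], colorBits[3],
           depthBits, stencilBits);
}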

OK, but this also happens on GeForce 2 GTS. Is what you said valid for this card as well?

On GeForce 2 GTS it’s not much of a problem, since it’s goddamn fast in 32-bit color mode, whereas TNT sucks big time and only performs well in 16-bit color mode.

I don’t quite understand… what makes you think this is a z-buffer issue and not a color buffer issue? At the company I work for, our current title is being developed concurrently for the PC and the PS2. We run in 16-bit color mode in both cases and use 16-bit textures. Many of our textures are banded, and we are currently working on workarounds. This happens on both the PC AND the PS2! It also has nothing to do with the z-buffer. Your problem might be different, though.
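
(If it does turn out to be color banding in the textures, one generic workaround on the GL side — not necessarily what we’re doing — is to ask for an 8-bits-per-channel internal format instead of letting the driver pick. A rough sketch, with placeholder texture data:)

#include <GL/gl.h>

/* sketch: 'pixels' stands in for real texture data loaded elsewhere */
void upload_texture(const unsigned char *pixels, int width, int height)
{
    /* GL_RGB8 asks for 8 bits per channel; with an unsized GL_RGB (or
       GL_RGB5) the driver may pick a 16-bit format in 16-bit color
       mode and band. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}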

The GeForce2 GTS (like almost all modern consumer cards) uses a 16-bit z-buffer when in 16-bit color mode.
Yes, 16-bit color causes other artifacts as well, you’re right, but in most cases (single- or dual-pass rendering) 16-bit color introduces far fewer artifacts than a 16-bit z-buffer does.
The best way to reduce artifacts with a 16-bit z-buffer is to push the near clipping plane as far out as possible, and to keep the far plane as near as possible. Depth precision is distributed non-linearly: most of it sits right in front of the near plane, so moving znear out helps far more than pulling zfar in (see the quick numbers below).
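
To put a number on that: for a standard perspective projection, window z is f(d-n)/(d(f-n)) for eye distance d, so the eye-space thickness of one depth bucket grows with d squared and shrinks as znear moves out. A toy calculation (standard glFrustum-style projection assumed, not code from anyone’s app):

#include <stdio.h>
#include <math.h>

/* Approximate eye-space thickness of one depth bucket at distance d,
   for a perspective projection with near n, far f, and a 'bits'-deep
   z-buffer: inverse of the window-z slope, divided by bucket count. */
static double bucket_size(double d, double n, double f, int bits)
{
    double buckets = pow(2.0, bits) - 1.0;
    double dz_dd   = f * n / (d * d * (f - n)); /* d(window z)/dd */
    return 1.0 / (dz_dd * buckets);
}

int main(void)
{
    /* Same far plane, two near planes: ~10x more precision at n=1. */
    printf("n=0.1, f=1000: bucket at d=100 is %.2f units\n",
           bucket_size(100.0, 0.1, 1000.0, 16));
    printf("n=1.0, f=1000: bucket at d=100 is %.2f units\n",
           bucket_size(100.0, 1.0, 1000.0, 16));
    return 0;
}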

beavis:
if your app does various passes, uses blending, etc., then our PS2 pal is right: your problem is probably more related to color depth than to z-buffer depth.


Also make sure dithering is enabled.
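
It’s on by default, but just in case something turned it off:

glEnable(GL_DITHER);  /* dithering is enabled by default in OpenGL,
                         but make sure nothing has disabled it */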

- Matt

Call me crazy, but “huge bands” sounds like a job for Mr. Polygon Offset. Granted, the performance penalty (depending on your hardware) may render this suggestion moot, but give it a try.

glEnable(GL_POLYGON_OFFSET_FILL);  /* offset the depth of filled polygons */
glPolygonOffset(1.0f, 1.0f);       /* offset = factor * max depth slope
                                      + units * smallest resolvable depth step */

For appropriate values for your implementation, check out: http://trant.sgi.com/opengl/docs/man_pages/hardcopy/GL/html/gl/polygonoffset.html

<shrug>

Glossifah