Why Fewer Depth (Z) Bits on a Dell Dimension 8100 than on a Crummy Laptop?

I developed and tested an OpenGL simulation on a 133 MHz Toshiba 440CDT laptop, which ought to be in the Smithsonian soon (the laptop, I mean, though perhaps my simulation too), but I digress …

I’m currently running it on a Dell Dimension 8100 desktop (Pentium 4 1.3 GHz, 256 MB RDRAM, 32 MB Nvidia GeForce2 MX card): in a nutshell, a Ferrari compared to the Yugo of a laptop.

The simulation runs fine and is obviously much faster, and the colors are fine too, but I noticed that I only get 16 depth (Z-buffer) bits, as opposed to 32 on the laptop.

Specifically,

glGetIntegerv(GL_DEPTH_BITS, &depthbits);

returns depthbits=16 on the Dell and 32 on the Laptop.
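(For completeness, a throwaway cross-check along the lines of the untested fragment below, called after glutCreateWindow() so that a GL context actually exists, should report the same number through GLUT as through GL; print_depth_bits is just a name I made up for illustration.)

/* Untested fragment: cross-check the depth buffer size through GL and
   through GLUT. Call it only after glutCreateWindow(), once a context
   exists; before that the numbers are meaningless. */
#include <GL/glut.h>
#include <stdio.h>

static void print_depth_bits(void)
{
    GLint depthbits = 0;
    int glutbits;

    glGetIntegerv(GL_DEPTH_BITS, &depthbits);       /* what OpenGL reports */
    glutbits = glutGet(GLUT_WINDOW_DEPTH_SIZE);     /* what GLUT reports   */
    printf("GL_DEPTH_BITS = %d, GLUT_WINDOW_DEPTH_SIZE = %d\n",
           (int)depthbits, glutbits);
}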

Please bear in mind that I’m only talking about the DEPTH BUFFER BITS, NOT the color bits (I’m using 32-bit true color anyway). I bring this up because I got several well-intentioned but irrelevant responses about color settings when I posted this originally in comp.graphics.api.opengl.

According to the Nvidia GeForce2 specs, the card should support 32 bits of Z/stencil buffer. So why is my code only being allocated 16 Z-buffer bits?

Is there some way to fix this using SetPixelFormat() and related routines in Windows? I haven't tried that yet because I'm no good at Windows programming.
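My rough understanding is that it would look something like the sketch below (completely untested; it assumes hdc is the device context of the window the GL context gets created on, and setup_pixel_format is just a name I made up):

/* Untested sketch: ask Windows for a pixel format with a 24-bit depth
   buffer plus 8 stencil bits (24 + 8 = the 32 Z/stencil bits on the
   spec sheet), then print what the driver actually granted. */
#include <windows.h>
#include <stdio.h>

static BOOL setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int pf;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;   /* 32-bit true color, same as now */
    pfd.cDepthBits   = 24;   /* ask for a 24-bit Z buffer      */
    pfd.cStencilBits = 8;    /* plus 8 stencil bits            */
    pfd.iLayerType   = PFD_MAIN_PLANE;

    pf = ChoosePixelFormat(hdc, &pfd);   /* closest format the driver offers */
    if (pf == 0 || !SetPixelFormat(hdc, pf, &pfd))
        return FALSE;

    /* See what was actually granted, not just what was requested */
    DescribePixelFormat(hdc, pf, sizeof(pfd), &pfd);
    printf("granted: %d color, %d depth, %d stencil bits\n",
           pfd.cColorBits, pfd.cDepthBits, pfd.cStencilBits);
    return TRUE;
}

As far as I can tell, this would be followed by the usual wglCreateContext()/wglMakeCurrent() calls on the same hdc.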

Or is there a quicker fix using GLUT?
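The only candidate I can see is glutInitDisplayString() from GLUT 3.7, which lets you ask for a minimum depth size before the window is created; something like the snippet below, though I haven't tried it, so I don't know whether it actually gets past 16 bits on the Dell:

/* Untested: request at least a 24-bit depth buffer up front.
   glutInitDisplayString() is GLUT 3.7 only; it takes the place of the
   usual glutInitDisplayMode() call and must come before glutCreateWindow(). */
glutInit(&argc, argv);
glutInitDisplayString("rgba double depth>=24");
glutCreateWindow("simulation");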

Any help in this regard will be deeply appreciated.

Thanks and regards

-Sharat (sharat@playful.com)
