I have a big problem with the Z-buffer: with some pixel formats my engine doesn't update the Z-buffer correctly. For example, on a GeForce4 MX with a pixel format of color:32 depth:32 stencil:0 the polygons are drawn wrong, but the same scene with color:32 depth:24 stencil:0 is fine!
Why? I have the same problem on a G400, but in that case the only pixel format that works properly is one without double buffering!
It seems that using glPolygonOffset changes something, but I still don't get the correct Z values in the depth buffer!
Any suggestions? How can I obtain the correct pixel format?
You can take a look at the last message in the Beginners forum first if you want.
ChoosePixelFormat tries to match your request, but it doesn't always succeed. Asking for 32-bit color is odd, since the color bits cover only RGB; alpha bits have their own field in the structure, so you are essentially asking for some 11/11/10-bit format. NVIDIA cards can't have more color bits in OpenGL than the current desktop mode.
32-bit depth is not very common. I can't speak for the G400, but no NVIDIA card to date (I haven't checked the FX) has more than 24 depth bits.
After ChoosePixelFormat, run the returned format number through DescribePixelFormat and see what you actually have; don't assume you got what you asked for.
Another solution, instead of ChoosePixelFormat, is to describe all the pixel formats there are and write some sort routine that picks the correct one for you.
I think the problem is that when you ask for 32-bit Z and it's not available, Windows picks 16-bit Z instead of 24.
NVIDIA hardware typically supports 16-bit and 24-bit Z. Check what value you actually get back; if you don't get 32-bit, try requesting 24-bit before defaulting to 16-bit.
If you ask for stencil (32-bit Z-buffer and 8-bit stencil), you do get 24-bit depth and 8-bit stencil, so you avoid the problem.
Maybe performance-wise it's better to have the stencil buffer as well, and to clear the Z and stencil together instead of just the stencil. That's what a document said, but I'm not sure if it was from NVIDIA.
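In code, clearing them together just means combining the mask bits in a single glClear call, which lets the driver do one fast clear of the packed 24/8 buffer (this fragment assumes a current GL context):

```c
/* One combined clear of the packed depth/stencil buffer... */
glClear(GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

/* ...instead of two separate clears, which can defeat the fast path: */
glClear(GL_DEPTH_BUFFER_BIT);
glClear(GL_STENCIL_BUFFER_BIT);
```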