I have the following question:
How can I figure out the maximum Z precision of a card?
I mean, what is the maximum value of the cDepthBits field of the PIXELFORMATDESCRIPTOR structure?
Anybody?
I am sure that the Z precision is somehow connected to the colour depth, but I do not understand why I cannot set a higher precision in 16 bpp mode. (I use less memory for colour, so there should be room left for Z precision.)
Below is a part of my log.
When I call SetPixelFormat I always request cDepthBits = 32.
Here’s what I could actually get (NVIDIA TNT2, Detonator 3 drivers):
After switching to a 16 bpp mode:
Opengl::SetPixelFormat -Start
ColorBits:16
DepthBits:16
AccumBits:64
StencilBits:0
Opengl::SetPixelFormat - End
After switching to a 32 bpp mode:
Opengl::SetPixelFormat -Start
ColorBits:32
DepthBits:24
AccumBits:64
StencilBits:0
Opengl::SetPixelFormat - End
thanks,
mandroka
[This message has been edited by mandroka (edited 09-07-2000).]
Hi !
I don’t know if this helps, but you can query all the pixel formats your driver supports with DescribePixelFormat() and then choose the one you like. I have a program that lists all the pixel formats supported by each driver; if you want it, send me an email.
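That enumeration could look something like the sketch below. This is a minimal Win32-only example, assuming you already have a valid device context `hdc` from GetDC(); the helper name `MaxDepthBits` is made up for illustration. It relies only on the documented behaviour that DescribePixelFormat returns the total number of formats on the DC:

```c
/* Win32-only sketch: walk every pixel format on a device context and
 * report the largest cDepthBits the driver exposes for OpenGL use. */
#include <windows.h>

int MaxDepthBits(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int maxDepth = 0;

    /* DescribePixelFormat returns the number of available formats. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (int i = 1; i <= count; i++) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        /* Only consider formats that can actually drive OpenGL. */
        if ((pfd.dwFlags & PFD_SUPPORT_OPENGL) && pfd.cDepthBits > maxDepth)
            maxDepth = pfd.cDepthBits;
    }
    return maxDepth;
}
```

Note that pixel format indices are 1-based, which is why the loop runs from 1 to count inclusive.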
Actually, what I wanted was to figure out the maximum precision, because my terrain rendering engine has a lot of Z-fighting bugs. When I set a higher precision, they go away.
I also asked someone at NVIDIA, and he told me that in 16 bpp mode the maximum Z depth is equal to the colour depth.
You could try enumerating all the pixel format descriptors until you find the one with the greatest Z-buffer precision.
Originally posted by mandroka:
[b]How can I figure out the maximum Z precision of a card? I mean, what is the maximum value of the cDepthBits field of the PIXELFORMATDESCRIPTOR structure? [...][/b]
My routine does the same: I set the desired Z depth, and it chooses the next best matching pixel format that is greater than or equal to it. But on our cards (mostly NVIDIA) we still have those ugly Z issues.
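The "next best match, greater or equal" rule can be sketched as a plain comparison loop. This is pure C with no WGL calls; the `Format` struct and `PickFormat` name are made up to illustrate the selection logic only:

```c
/* Hypothetical, simplified view of what a driver reports per format. */
struct Format { int colorBits; int depthBits; };

/* Pick the format whose depth is >= desiredDepth and closest to it;
 * if no format reaches desiredDepth, fall back to the deepest one.
 * Returns an index into fmts, or -1 if the list is empty. */
int PickFormat(const struct Format *fmts, int n, int desiredDepth)
{
    int best = -1;
    for (int i = 0; i < n; i++) {
        if (best < 0) { best = i; continue; }
        int d = fmts[i].depthBits, b = fmts[best].depthBits;
        if (b < desiredDepth) {
            if (d > b) best = i;          /* any deeper format is better */
        } else if (d >= desiredDepth && d < b) {
            best = i;                     /* still >= desired, tighter fit */
        }
    }
    return best;
}
```

For example, given formats with 16, 24, and 32 depth bits, asking for 24 picks the 24-bit one, asking for 30 picks the 32-bit one, and asking for 40 falls back to 32.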
You usually shouldn’t have too many Z-fighting problems, and you don’t sound like a newbie (I am one). But one thing I’ve noticed: do NOT set the near clip plane to 0. The usual value people suggest is 1, although I guess it depends a lot on the scale of your scene.
Our engine is a Glide-based one; I am doing the GL conversion. It is almost ready, except for the Z-buffer bugs.
(It is an outdoor, heightfield-based terrain engine.)
What I found interesting is that the near clipping plane is at 1, and the far plane is between 700 and 4000, depending on the player’s maximum-distance setting.
I solved this bug once, but I cannot remember what I did. (Shame.)