I use this pixel format descriptor:
static PIXELFORMATDESCRIPTOR pfd=
m_iDepthBits is set to 24
m_iStencilBits is set to 8
If I set m_iColorDepth to 32, it works fine.
If I set it to 16, I get the Microsoft software renderer and my program crashes (but that could be something else).
Do GeForces (with the 45.25 drivers) only support 32-bit framebuffers?
If I remember correctly, I once used a 16-bit buffer, but that only worked without an alpha channel. Since I request an alpha channel, it does not work.
Is this normal?
You won’t get 16-bit colour if you need destination alpha and/or stencil, IIRC. With 24-bit colour and Z you get nice 32-bit alignment with alpha and Z/stencil, which the memory controller probably likes.
So if I really want 16-bit color, what other settings do I have to use then?
16-bit color and 16-bit Z and no stencil?
Is there any combination with alpha + stencil other than 32-bit color / 24-bit depth / 8-bit stencil?
Jan, enumerate the pixel formats and choose one.
Take a look at WGL_ARB_pixel_format.
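A minimal sketch of the plain GDI-side enumeration (DescribePixelFormat returns the total number of formats; the extension route V-man mentions would use wglGetPixelFormatAttribivARB instead, not shown here). Windows-only:

```c
#include <windows.h>
#include <stdio.h>

/* Enumerate every pixel format the driver exposes and print the
   hardware-accelerated RGBA ones that have alpha bits. */
int main(void)
{
    HDC hdc = GetDC(NULL);   /* the screen DC is enough for enumeration */
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (int i = 1; i <= count; ++i)
    {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        /* PFD_GENERIC_FORMAT without PFD_GENERIC_ACCELERATED means the
           Microsoft software renderer - exactly what we want to avoid. */
        if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
            !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
            continue;

        if (pfd.iPixelType == PFD_TYPE_RGBA && pfd.cAlphaBits > 0)
            printf("format %d: color=%d alpha=%d depth=%d stencil=%d\n",
                   i, pfd.cColorBits, pfd.cAlphaBits,
                   pfd.cDepthBits, pfd.cStencilBits);
    }

    ReleaseDC(NULL, hdc);
    return 0;
}
```

Scanning that list directly answers the question: if no accelerated entry combines 16-bit color with alpha and stencil, the driver simply does not offer one.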
I just must ask… why do you need 16-bit? Most people aim for better quality all the time.
Of course I want good quality, but only if the PC is fast enough. Isn’t that why people use a 16-bit color buffer? Because their PC is not good enough? I just want another option to reduce the quality but increase the speed, and in a fillrate-limited situation the color depth can make a big difference.
V-man: I’ll do that, good idea.
I downloaded NVIDIA’s “NV Pixel Format 1.0”.
It says that my GF4 supports only two 16-bit color formats with alpha. However, those two formats use 16 bits for RGB plus an additional 8 bits for alpha. Also, they are only for rendering to bitmaps (not to a window) and are only rendered in software, so no hardware acceleration.
Seems as if there is no 16-bit color mode with alpha.