OK, DescribePixelFormat gives me some formats that are 3D-accelerated and use 16-bit color. However, when I try to create a rendering context with ANY of these, it fails with an invalid-parameter error. Apparently my 3D card only supports 32-bit color. Why the hell did DescribePixelFormat give me these dumb formats in the first place? Or is it something else that’s being dumb?
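Here is roughly what I am doing, stripped down to a throwaway sketch (TryFormat is just a helper I made up for this post; assume hdc is a valid window DC and format is one of the accelerated 16-bit indices that DescribePixelFormat reported):

#include <stdio.h>
#include <windows.h>

/* Try to set one of the reported formats and create a GL context on it. */
void TryFormat(HDC hdc, int format)
{
    PIXELFORMATDESCRIPTOR pfd;
    HGLRC rc;

    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);

    if (!SetPixelFormat(hdc, format, &pfd))
        printf("SetPixelFormat failed, error %lu\n", GetLastError());

    rc = wglCreateContext(hdc);          /* this is the call that fails for me */
    if (rc == NULL)
        printf("wglCreateContext failed, error %lu\n", GetLastError());
    else
        wglDeleteContext(rc);
}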
Looking at these “16-bit” pixel formats, I noticed this:
cRedBits = 8
cGreenBits = 8
cBlueBits = 8
cRedShift = 16
cGreenShift = 8
cBlueShift = 0
Doesn’t that look like 24-bit color to you? Oddly enough, cColorBits is 16. Perhaps I don’t quite understand what cColorBits and the other color variables are. Can someone explain that to me? The help files (and the book I have) have completely incomprehensible explanations for them.
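In case it helps, this is the little loop I use to dump what the driver reports for every format (DumpPixelFormats is just a name I picked, and again hdc is assumed to be a valid window DC); maybe someone can spot what I am misreading:

#include <stdio.h>
#include <windows.h>

/* Print the color-related fields of every pixel format the driver reports. */
void DumpPixelFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, count;

    /* DescribePixelFormat returns the total number of formats for this DC. */
    count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (i = 1; i <= count; i++)
    {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        printf("%3d: cColorBits=%d  R=%d<<%d G=%d<<%d B=%d<<%d  %s%s\n",
               i, pfd.cColorBits,
               pfd.cRedBits,   pfd.cRedShift,
               pfd.cGreenBits, pfd.cGreenShift,
               pfd.cBlueBits,  pfd.cBlueShift,
               (pfd.dwFlags & PFD_SUPPORT_OPENGL) ? "OpenGL " : "",
               (pfd.dwFlags & PFD_GENERIC_FORMAT) ? "(generic)" : "(ICD)");
    }
}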
Originally posted by Serge K: FireGL1/2/3
(well, actually these cards can work in 32-bit only)
I have run into this kind of problem under Borland C++ Builder, and I realized that the system had not loaded opengl32.dll properly. I don’t know if that is your problem, but check this before doing anything else…
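If you want to rule that out, something like this is a quick sanity test that opengl32.dll can actually be found and loaded (just a rough sketch, not exactly what I did back then):

#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* See whether opengl32.dll can be found and loaded at all. */
    HMODULE gl = LoadLibraryA("opengl32.dll");
    if (gl == NULL)
    {
        printf("LoadLibrary(opengl32.dll) failed, error %lu\n", GetLastError());
        return 1;
    }

    /* Resolve one well-known entry point as a further check. */
    if (GetProcAddress(gl, "wglCreateContext") == NULL)
        printf("opengl32.dll loaded, but wglCreateContext was not found\n");
    else
        printf("opengl32.dll loaded OK\n");

    FreeLibrary(gl);
    return 0;
}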