If anybody has experience with this, please help: how can I output images “normally” (i.e. with dithering) on a display in 16-bit color depth mode, using OpenGL?
Here is an example (in 32-bit and in 16-bit…):
Maybe the solution is on the OpenGL side:
> in the pixel format initialization (SetPixelFormat)?
> or some kind of dithering (but glEnable(GL_DITHER) has no visible effect…)
Or is the solution on the system GDI side:
> maybe I need to change the palette (using API functions)?
Or something else?
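For reference, my initialization looks roughly like this (field values are illustrative; DC is the window's device context):

```pascal
var
  pfd: TPixelFormatDescriptor;
  PixelFormat: Integer;
begin
  FillChar(pfd, SizeOf(pfd), 0);
  pfd.nSize := SizeOf(pfd);
  pfd.nVersion := 1;
  pfd.dwFlags := PFD_DRAW_TO_WINDOW or PFD_SUPPORT_OPENGL or PFD_DOUBLEBUFFER;
  pfd.iPixelType := PFD_TYPE_RGBA;
  pfd.cColorBits := 16;          // the 16-bit color depth mode in question
  pfd.cDepthBits := 16;
  pfd.iLayerType := PFD_MAIN_PLANE;
  PixelFormat := ChoosePixelFormat(DC, @pfd);
  SetPixelFormat(DC, PixelFormat, @pfd);
end;
```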
Huh… While I was waiting for an answer, I found the solution myself.
Now I choose the right internal texture format when registering the texture:
const
// I copied this constant from the OpenGL 1.1 unit
GL_RGBA4 = $8056;
begin
... blah-blah-blah ...
// Note: the second parameter is the mipmap level (0),
// the third is the internal format (GL_RGBA4).
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, TextureSizeX, TextureSizeY,
             0, GL_RGBA, GL_UNSIGNED_BYTE, TextureData);
... blah-blah-blah ...
end;
But I have one more question: this code works fine in OpenGL 1.1 and later… But I need to use OpenGL 1.0 functions ONLY! How can I choose an RGBA texture format with 4 bits per component (like GL_RGBA4) using only OpenGL 1.0 functions (and constants)?
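From reading the 1.0 spec, it seems the internalformat parameter of glTexImage2D in OpenGL 1.0 is only the component count (1..4); sized formats such as GL_RGBA4 were introduced in 1.1. So the closest I can get is to quantize the pixel data to 4 bits per channel myself before uploading, and request 4 components. A rough sketch (procedure and parameter names are mine; a real implementation would add an ordered-dither step before truncation):

```pascal
procedure UploadRGBA4Compatible(Data: PByte; W, H: Integer);
var
  i: Integer;
  p: PByte;
begin
  p := Data;
  for i := 0 to W * H * 4 - 1 do
  begin
    // Keep only the top 4 bits of each channel, replicating them
    // into the low bits so the value range stays 0..255.
    p^ := (p^ and $F0) or (p^ shr 4);
    Inc(p);
  end;
  // In OpenGL 1.0, internalformat 4 means "four components";
  // the driver chooses the actual precision.
  glTexImage2D(GL_TEXTURE_2D, 0, 4, W, H, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, Data);
end;
```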