16-bit textures

Does this make it so textures are stored in video RAM as 16-bit?

cfg.r_texturebits = 16;

if (cfg.r_texturebits == 32)
{
    glTexImage2D(GL_TEXTURE_2D, level, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image);
}
if (cfg.r_texturebits == 16)
{
    glTexImage2D(GL_TEXTURE_2D, level, GL_RGBA4, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image);
}

The image loads either way, but I see no visual difference between the textures at 32-bit and 16-bit. Or am I wrong?

Were you in a 16-bit color mode when testing each type of texture? It is likely that your OpenGL driver is automatically downsampling 32-bit textures to RGBA4 while in a 16-bit color mode. The reverse could also be true: if you were in a 32-bit color mode, the 16-bit texture could be upsampled and stored as RGBA8. The internal format parameter, as I understand it, is just a request; the driver can choose to ignore it. Someone correct me if I'm off base here, please.
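
If you want to verify what the driver actually did, here is a minimal sketch (not from the original posts; it assumes the texture uploaded in the snippet above is still bound to GL_TEXTURE_2D and reuses its level variable). It asks OpenGL for the per-component bit sizes it really allocated:

GLint redBits, greenBits, blueBits, alphaBits;
glGetTexLevelParameteriv(GL_TEXTURE_2D, level, GL_TEXTURE_RED_SIZE, &redBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, level, GL_TEXTURE_GREEN_SIZE, &greenBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, level, GL_TEXTURE_BLUE_SIZE, &blueBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, level, GL_TEXTURE_ALPHA_SIZE, &alphaBits);
/* 4/4/4/4 means the driver really stored the texture as RGBA4;
   8/8/8/8 means it kept (or promoted) it to RGBA8 despite the request */

If both your 32-bit and 16-bit uploads report the same sizes, that would explain why they look identical on screen.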

Yep. I found that this works, and some textures do look worse (16-bit). But it was so strange that making 16-bit and 32-bit textures is this simple, too simple :]

This is for DFrey: I recently installed an ATI Xpert (Rage Pro 128, 16 MB) and there is an option in the OpenGL tab of the driver to always convert 32-bit textures to 16-bit. There are also these settings: enable page flipping, disable dithering when alpha blending, enable KTX buffer region extension(?), force 16-bit Z-buffer, wait for vertical sync, subpixel precision (2/4), and a slider bar for fuzziness vs. sharpness. So it looks like several things in OpenGL can be driver driven(?).
Tim