RGB565 textures?

I was wondering, could somebody show me how to load a texture in RGB565 format? I would like to give my users the option to use 16-bit textures in case they have lower-end systems.

I’ve tried using GL_UNSIGNED_SHORT_5_6_5, but have had no luck.

Haven’t tried it myself:
If the original is a 24- or 32-bit image, try converting it into a 565 image. In both 24- and 32-bit formats, each intensity ranges from 0 to 255, but in 565 the ranges are 0…31, 0…63 and 0…31 (shift right by 3, 2 and 3 bits, respectively).
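A minimal sketch of that conversion (the function name and buffer layout are my own; the source is assumed to be tightly packed 24-bit RGB):

```c
#include <stddef.h>
#include <stdint.h>

/* Pack one 8-bit-per-channel pixel into 565: drop the low 3, 2 and 3
 * bits of R, G and B, then place them in the high, middle and low
 * bit fields of a 16-bit value. */
uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Convert a tightly packed 24-bit RGB buffer to 16-bit 565. */
void convert_image_to_rgb565(const uint8_t *src, uint16_t *dst, size_t pixels)
{
    for (size_t i = 0; i < pixels; ++i)
        dst[i] = rgb888_to_rgb565(src[3 * i], src[3 * i + 1], src[3 * i + 2]);
}
```

For example, white (255, 255, 255) packs to 0xFFFF and pure red (255, 0, 0) to 0xF800.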

take a look at this extension:

You don’t strictly have to load them in RGB565. OpenGL TexImage calls have two independent parameters:

internalformat: A hint on how the texture should be stored by the driver. A driver will generally choose the most appropriate match. For example, there are RGBA12 and RGBA16 internal format enums (12- and 16-bit components), but most hardware lacks this high-precision fixed-point support and will store such textures as RGBA8 instead.

format / type: Describes the data to load into the texture. This is NOT a hint – it describes the format of your image.

If you specify RGB/UNSIGNED_BYTE, that should be fairly well-optimized by most drivers. An exact format match may be better (and requires less memory for your source image). That would be RGB/UNSIGNED_SHORT_5_6_5 (or possibly BGR, but I think RGB is appropriate). How the pixels are stored is described in the EXT_packed_pixels extension spec and also in the OpenGL 1.2 (or later) spec, which incorporated and extended EXT_packed_pixels.
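Putting the two parameters together, a sketch of such an upload might look like this (assumes a current GL context, a bound 2D texture, and `pixels` pointing to `width * height` texels already packed as 565; GL_RGB5 is the closest classic internalformat hint for 16-bit color):

```c
/* Rows of 16-bit texels: relax the default 4-byte unpack alignment. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 2);

glTexImage2D(GL_TEXTURE_2D,
             0,                        /* mip level */
             GL_RGB5,                  /* internalformat: a hint only */
             width, height,
             0,                        /* border */
             GL_RGB,                   /* format of the source data */
             GL_UNSIGNED_SHORT_5_6_5,  /* type: packed 565 pixels */
             pixels);
```

Note the split: GL_RGB5 only asks the driver for roughly-16-bit storage, while GL_RGB + GL_UNSIGNED_SHORT_5_6_5 states exactly what your source buffer contains.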

BTW – you can call GetTexLevelParameteriv to determine the actual number of bits in each texture component. There’s no query to determine the component order (i.e., whether RED is in the high or low bits).
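A sketch of that query (assumes a current GL context and a texture bound to GL_TEXTURE_2D with data at level 0):

```c
/* Ask the driver how many bits it actually allocated per component. */
GLint red_bits, green_bits, blue_bits;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &red_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &green_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blue_bits);
/* A texture the driver stored as 565 would report 5, 6, 5 here. */
```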