This is my first post, but here goes:
I'm having a problem with 16-bit textures in the 5551 RGBA bit format. A 24-bit version of the texture, a simple RAW file, worked fine. However, when I converted the file to a 16-bit texture, I could not get OpenGL to render it. I know the 16-bit file itself is fine; the file size checks out. Here's what I'm doing:
void* LoadRAW(char* pTexture, int nNumber, int nDepth)
{
    FILE* pFile = fopen(pTexture, "rb");       /* open in binary read mode */
    char* pData = new char[128 * 128 * 2];     /* 128x128 pixels, 2 bytes each */
    fread(pData, sizeof(char), 128 * 128 * 2, pFile);
    fclose(pFile);

    glGenTextures(1, &texture[nNumber]);
    glBindTexture(GL_TEXTURE_2D, texture[nNumber]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGB5_A1, GL_UNSIGNED_BYTE, pData);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return pData;
}
Ignore the return value; it isn't used, and this code isn't cleaned up, but you get the idea. I believe the problem is with the parameters passed to glTexImage2D, but I'm not sure. My screen resolution is set to 16-bit color, so that is not the problem. If anyone can help, please do.