Displaying a transparent .tga texture

Hi to everyone! My latest problem is this:
I’ve made a texture in GIMP that includes transparency in some areas. Looking at the file properties (I’m using Win2k), it reports a color depth of 96 bits. What are the proper settings for
glTexImage2D so I can display my texture properly? Before I added the transparency it displayed ok. After adding it, the program loads ok, but no texture shows. My current try with the function is:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, g_tgaFile[0]->imageWidth,
				 g_tgaFile[0]->imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, g_tgaFile[0]->imageData);

but nothing shows (except my untextured quad, of course).

Hi !

This depends on how you load the image into RAM. OpenGL itself has no support for image loading, so you must have some code of your own that takes care of that, and it all depends on how that code stores the loaded image in memory. If it is 4 bytes per pixel (red, green, blue, and alpha), your example should work; if not, you need to adjust the format arguments to match. So it all depends on how the image is loaded into RAM.
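For reference, the relevant facts live in the fixed 18-byte TGA header. A minimal sketch of reading it from a raw byte buffer (names like `TgaInfo` and `parseTgaHeader` are illustrative, not from the poster’s loader; note also that 32-bit TGA pixel data is stored as BGRA, so feeding it to OpenGL as GL_RGBA will swap red and blue unless you use GL_BGRA_EXT or swizzle yourself):

```cpp
#include <cstdint>

// Illustrative struct for the fields we care about in the
// 18-byte TGA header; not the poster's TGAFILE type.
struct TgaInfo {
    uint8_t  imageType;     // 2 = uncompressed true-color
    uint16_t width, height;
    uint8_t  bitsPerPixel;  // 32 means 4 bytes/pixel, stored B,G,R,A
};

// TGA stores multi-byte fields little-endian, so assemble
// them byte by byte instead of casting the buffer.
TgaInfo parseTgaHeader(const uint8_t* buf) {
    TgaInfo info;
    info.imageType    = buf[2];
    info.width        = static_cast<uint16_t>(buf[12] | (buf[13] << 8));
    info.height       = static_cast<uint16_t>(buf[14] | (buf[15] << 8));
    info.bitsPerPixel = buf[16];
    return info;
}
```

Checking `bitsPerPixel == 32` after loading is a quick way to confirm the alpha channel actually made it into the file.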


I’ve managed to load the .tga file created in GIMP. In GIMP the white area looks transparent, but not in my program. You can see a screenshot here.

I just don’t want the white border around the finger to show. That’s all! I’ve tried various combinations of blending and alpha testing but none of them worked.
Also, I just checked the size of the image read in by fread, and the number matches 128×256×4 (128×256 being the texture dimensions). So I must be reading in the alpha component correctly.

Enable blending, use the blend function GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, and it should be okay if the alpha is set properly.
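In case it helps to see what that blend function actually computes: each channel of the output is `src * srcAlpha + dst * (1 - srcAlpha)`. A quick sketch of the math in plain C++ (no GL context needed):

```cpp
// Per-channel result of glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
// the incoming fragment is weighted by its alpha, and the pixel already
// in the framebuffer by one minus that alpha.
float blendChannel(float src, float dst, float srcAlpha) {
    return src * srcAlpha + dst * (1.0f - srcAlpha);
}
```

So where alpha is 0 (the white border) the background shows through unchanged, and where alpha is 1 the texel wins outright. Remember to call `glEnable(GL_BLEND)` as well, or the blend function is ignored.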

Almost there. I’ve gotten rid of the border but now the image color is all black. This is what I mean:

This is how the image looks in photoshop cs:

and this is my source code from the init function:

glClearColor(1.0, 1.0, 1.0, 1.0);
g_tgaFile = new TGAFILE;
LoadTGAFile("pp.tga", g_tgaFile);
glGenTextures(1, &texture);                  // generate texture object
glBindTexture(GL_TEXTURE_2D, texture);       // enable our texture object
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, g_tgaFile->imageWidth,
				 g_tgaFile->imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, g_tgaFile->imageData);

and in my display function:

glColor3f(0.0, 1.0, 0.5);
glRecti(-560, -450, -200, -300);
glColor3f(0.0, 0.0, 0.0);
glRecti(200, 0, 400, 100);

glTranslatef(0.0, 0.0, 1.0);
glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);   // glTexCoord/glVertex calls are only valid inside glBegin/glEnd
glTexCoord2f(0.0, 1.0);  glVertex2f(g_fFingerX, g_fFingerY);
glTexCoord2f(0.0, 0.0);  glVertex2f(g_fFingerX, g_fFingerY - g_fFingerSize);
glTexCoord2f(1.0, 0.0);  glVertex2f(g_fFingerX + (g_fFingerSize * 2.0 / 4.0), g_fFingerY - g_fFingerSize);
glTexCoord2f(1.0, 1.0);  glVertex2f(g_fFingerX + (g_fFingerSize * 2.0 / 4.0), g_fFingerY);
glEnd();

Any ideas? I’m out of ideas myself.

You can change texenv (see doc for glTexEnv):
“GL_TEXTURE_ENV_MODE defaults to GL_MODULATE and GL_TEXTURE_ENV_COLOR defaults to (0, 0, 0, 0).”

Modulate multiplies the vertex-defined color and lighting by the texture. Replace takes only the texture into account. If you don’t need lighting, it is the preferred choice.
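The arithmetic behind GL_MODULATE is just a per-channel multiply, which is worth spelling out because a black current color silently wipes the texture. A sketch in plain C++:

```cpp
// GL_MODULATE: fragment = vertexColor * texel, per channel.
// GL_REPLACE:  fragment = texel; the vertex color is ignored.
float modulate(float vertexChannel, float texelChannel) {
    return vertexChannel * texelChannel;
}
```

With `glColor3f(0, 0, 0)` in effect, every texel channel is multiplied by zero and the quad comes out black; with white (1, 1, 1) the texels pass through unchanged.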

Have fun !

I really didn’t get the last post. But my problem now is that the texture is not showing at all. I mean, the code that actually loads the file is ok; I can see the proper dimensions and color depth (using watches), but the texture is just not showing! It can read the alpha information, obviously, as you can see in the last post, but the image just doesn’t appear. I’m about to go berserk!

I’ve finally found the problem! Because the color state before drawing the textured quad was black (0.0, 0.0, 0.0), the default GL_MODULATE texture environment multiplied the texture by it and turned it black, just as the earlier post explained. By changing the color to white (or, I think, any other bright color) I finally got the result I wanted! Wow, this thing drove me crazy for almost 2 days. Thanks to everyone for their help!