SDL_Image + PNG + Alpha + OGL = Argh!

Hi there! I’m rather new to GL, so please don’t flame me if I ask something that has been asked before (although I did search the whole place here :wink: )

Okay - now my question :slight_smile:

I’m using SDL & SDL_image for all initialization and texture loading. The problem is that GL seems to ignore the alpha channel of any loaded texture - I’m usually using PNGs with an alpha channel. The image is displayed correctly except for the transparency (even with GL_BLEND enabled).

Here’s my code for loading the Image:

        SDL_Surface *Tmp, *conv;
        Tmp = IMG_Load(File);                  //load to temporary surface

        //create a 32-bit RGBA surface with a byte order GL can use
        conv = SDL_CreateRGBSurface(SDL_SWSURFACE, Tmp->w, Tmp->h, 32,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
                0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
#else
                0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000);
#endif

        SDL_BlitSurface(Tmp, 0, conv, 0);      //copy the loaded image into it

        TextureNode *myNode;
        myNode = new TextureNode;              //allocate memory for the texture :)

        myNode->TX = conv->w;                  //store file dimensions
        myNode->TY = conv->h;

        glGenTextures(1, &myNode->Data);
        glBindTexture(GL_TEXTURE_2D, myNode->Data);

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glPixelStorei(GL_UNPACK_ROW_LENGTH, conv->pitch / conv->format->BytesPerPixel);

        glTexImage2D(GL_TEXTURE_2D, 0, 3, conv->w, conv->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);

        glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);

        SDL_FreeSurface(Tmp);                  //delete temporary surfaces
        SDL_FreeSurface(conv);

I guess the problem is somewhere in this line:
glTexImage2D(GL_TEXTURE_2D, 0, 3, conv->w, conv->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);

As far as I understand glTexImage2D, I need to specify the format of the pixel data for the texture (with 3 being RGB, for example). The problem is that when I put 4 instead of 3, the image isn’t displayed at all. Am I missing some texture env settings?

I use the following setup before drawing:

glDepthFunc(GL_LESS);
glEnable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_MODULATE);

glShadeModel(GL_SMOOTH);
glMatrixMode(GL_PROJECTION);
glMatrixMode(GL_MODELVIEW);

This happens just before drawing:
glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_DEPTH_TEST);
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER,0.0f);

and yet it doesn’t work :frowning:

Any help would be appreciated :slight_smile:

Cheers Phil

The third parameter of glTexImage2D is the internal format. It should be an enumeration, probably GL_RGBA in your case or one of its variations.

The use of 3 or 4 is only for backwards compatibility, and it should not be used in new applications. Either way, 3 is wrong because your texture has 4 components…
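
A sketch of what the call could look like with an explicit internal format (assuming conv really is the 32-bit RGBA surface created in your loader):

        //sketch: request an explicit 4-component internal format instead of the legacy "3"
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,   //internal format: RGBA, 8 bits per channel
                     conv->w, conv->h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE,    //layout of the data in conv->pixels
                     conv->pixels);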

Strangely enough, if I put 4 or GL_RGBA there, no image shows up at all.

Are you sure your texture is a power of 2?
Also, check the data type of the conv->pixels member. Is it unsigned char? When I’ve run into problems like this, they usually had to do with the data type.
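
If in doubt, a quick check along these lines (just a sketch; isPowerOfTwo is a made-up helper, conv is the surface from your loader, and printf needs <cstdio>):

        //hypothetical helper: true if n is a non-zero power of two
        static bool isPowerOfTwo(int n) { return n > 0 && (n & (n - 1)) == 0; }

        if (!isPowerOfTwo(conv->w) || !isPowerOfTwo(conv->h))
            printf("texture is %dx%d - not a power of two\n", conv->w, conv->h);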

What if you set 4 as the argument for glTexImage2D instead of 3 and disable alpha test and blending? Is your image still invisible, or does it just not show the transparency?
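
In other words, something like this as a diagnostic, keeping the rest of the loader the same (just a sketch):

        //diagnostic: upload with 4 components...
        glTexImage2D(GL_TEXTURE_2D, 0, 4, conv->w, conv->h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);

        //...and draw with blending and alpha test off, so at least the RGB part must appear
        glDisable(GL_BLEND);
        glDisable(GL_ALPHA_TEST);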

Got it to work. In the end it had something to do with the messing around I did with the byte order at the beginning.
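
For anyone who hits the same symptom later: one common pitfall with this loading pattern (not necessarily the exact thing that was fixed here) is that SDL_BlitSurface alpha-blends the loaded PNG onto the new surface instead of copying its alpha channel, which leaves the destination fully opaque. In SDL 1.2 you can clear the source surface’s per-surface alpha flag before the blit, roughly like this:

        //sketch (SDL 1.2): disable SDL_SRCALPHA on the loaded surface so the blit
        //copies the pixels - including alpha - instead of blending them away
        SDL_SetAlpha(Tmp, 0, 0);
        SDL_BlitSurface(Tmp, 0, conv, 0);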