I’m going crazy with an alpha texture problem. I’m loading a picture that
has an alpha channel, then creating a texture from it. Here’s the code that
creates the texture:
glGenTextures(1, &textureptr);
glBindTexture(GL_TEXTURE_2D, textureptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
This is the result I get:
The texture on the left is created with the above code; the one on the
right is the same texture, but created without alpha:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
See the huge difference? I wonder what I’m doing wrong. I’ve tried
different image loaders written by other people, and they all give the same
result, so my image loader isn’t to blame. I searched this forum for
threads on alpha textures and read tons of them, but none provided a
solution to my problem.
I’m using an orthographic projection because I’m working on a 2D project.
Can anyone help? Any ideas what I’m doing wrong?