Problems with displaying 2D Images (using DevIL)

Hello, I am currently working on a game, a Pacman remake.

So before I thought about using DevIL, I used a bitmap loader that I found on the internet. The problem with it was that it could only load .bmp files, and since I wanted small textures with transparency, I decided to look for an image loader that supports multiple image formats and is easy to implement and use.

Now, I have already found some examples of how to use the basic functions of DevIL, and I think I have done everything right, but somehow OpenGL can't read the data it gets: it only shows me a white screen where it should display the textures.

So I decided to use the advanced DevIL library (ILUT), which loads the given image directly as an OpenGL texture, and with that it could display something. But the result was somewhat blurry, as if something went wrong when the texture data was converted. (I would love to post a screenshot, but my mobile internet doesn't allow me to go on hosting sites.)

Anyway, here is the code where I initialize OpenGL:

int windowgl::InitGL(GLvoid)					//All Setup For OpenGL Goes Here
{
	if (!tex->loadBitmaps()) { return FALSE; }	// all Bitmaps are loaded and go into a list
	glEnable(GL_TEXTURE_2D);		//Enable Texture Mapping
	glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);
	glEnable(GL_COLOR_MATERIAL);
	glShadeModel(GL_SMOOTH);		//Enable Smooth Shading
	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);	//Black Background

	glClearDepth(1.0f);			//Depth Buffer Setup
	glEnable(GL_DEPTH_TEST);	//Enables Depth Testing
	glDepthFunc(GL_LEQUAL);		//The Type Of Depth Test To Do

	glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);		//Really Nice Perspective Calculations
	return TRUE;	//Initialisation Went OK
} 

Here I draw the field where the texture should be displayed:

int level::DrawGLScene(GLvoid)			//Here's Where We Do All The Drawing
{
	fbreite = 25.0f * M_X;
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);		//Clear The Screen And The Depth Buffer
	glLoadIdentity();										//Reset The Current Modelview Matrix
	
	//Draw the field with image textures
	for (int i = 0; i < M_Y; i++)
	{
		for (int j = 0; j < M_X; j++)
		{
			// Fetch the bitmap the level currently needs
			glBindTexture(GL_TEXTURE_2D, tex->huii("image.bmp")); // I simplified this to loading only one image, because in the real game the selection is too complicated to explain (but the result should be the same)
			// Draw a textured quad of 25x25 pixels
			glBegin(GL_QUADS);
				glTexCoord2f(0.0f, 0.0f);
				glVertex3f(0.0f, 25.0f, 0.0f);
				glTexCoord2f(1.0f, 0.0f);
				glVertex3f(25.0f, 25.0f, 0.0f);
				glTexCoord2f(1.0f, 1.0f);
				glVertex3f(25.0f, 0.0f, 0.0f);
				glTexCoord2f(0.0f, 1.0f);
				glVertex3f(0.0f, 0.0f, 0.0f);
			glEnd();
			glTranslatef(25.0f, 0.0f, 0.0f);		// Move the "origin" 25 pixels further along x, to where the next quad is drawn
		}
		glTranslatef(-fbreite, 25.0f, 0.0f);		// Move the "origin" back by the whole pixel width of the level and 25 pixels along y, to start drawing the next row
	}
	return TRUE;								// DrawGLScene is declared int, so return a success value
} 

and here is how I try to load the image (the exhausting way, where nothing is displayed at all):

GLuint textures::huii(char* fname)
{
	char path[30] = ".\\bitmaps\\";
	strcat_s(path, 30, fname);				// now contains the full path to the bitmap
	
	ILuint image, width, height;
	GLuint imgid;
	ilInit(); // Initialization of DevIL
	
	ilGenImages(1, &image);
	ilBindImage(image);
	ILboolean loaded = ilLoadImage(path);
	if (loaded == IL_FALSE) return 0;	// 0 is never a valid OpenGL texture name
	loaded = ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE); // convert every colour component into an unsigned byte; IL_RGBA keeps the alpha channel

	width = ilGetInteger(IL_IMAGE_WIDTH); // getting image width
	height = ilGetInteger(IL_IMAGE_HEIGHT); // and height
	
	glGenTextures(1, &imgid);
	glBindTexture(GL_TEXTURE_2D, imgid);
	
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, ilGetInteger(IL_IMAGE_BPP), width, height, 0, ilGetInteger(IL_IMAGE_FORMAT), ilGetInteger(IL_IMAGE_TYPE), ilGetData());
	ilDeleteImages(1, &image);
	return imgid;
} 

(and the easy way, where the images are displayed as if some data was lost during the conversion):

GLuint textures::huii(char* fname)
{
	char path[30] = ".\\bitmaps\\";
	strcat_s(path, 30, fname);				// now contains the full path to the bitmap
	
	ilInit();
	iluInit();
	ilutRenderer(ILUT_OPENGL);
	return ilutGLLoadImage(path);
} 

I hope you can help me solve the problem :wink:

What do

ilGetInteger(IL_IMAGE_BPP)
ilGetInteger(IL_IMAGE_FORMAT)
ilGetInteger(IL_IMAGE_TYPE)

return? The internal format should be GL_RGB, GL_RGB8, GL_RGBA or GL_RGBA8.

Format can be GL_BGRA or GL_RGBA and a few others.

Type can be GL_UNSIGNED_BYTE.

Don’t forget to validate with glGetError().
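
For example, a minimal sketch (printf is just a placeholder for whatever logging you use, and assumes <cstdio>):

// Drain any stale errors first, so the check below reflects only this call.
while (glGetError() != GL_NO_ERROR) {}

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());

GLenum err = glGetError();
if (err != GL_NO_ERROR)
	printf("glTexImage2D error: 0x%04X\n", err);	// GL_NO_ERROR is 0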

These functions return the format and type of the image, and since I convert every image first with the line

loaded = ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE);

every one of them returns the following values:

GLint t;
t = ilGetInteger(IL_IMAGE_BPP); //returns an integer value of 4
t = ilGetInteger(IL_IMAGE_FORMAT); //returns an integer value of 6408
t = ilGetInteger(IL_IMAGE_TYPE); //returns an integer value of 5121

These values are equal to the ones I would have to pass to the function to make it work:

t = GL_RGBA; //6408
t = GL_UNSIGNED_BYTE; //5121 

So the parameters should be valid.

How do I validate it with glGetError()? When glTexImage2D(…) is done, do I just have to check with glGetError() whether there were any errors, or what?

Try this:


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());

And don’t init IL each time you load a new texture.

Still doesn't work…

And the function glGetError() returns 1281 after glTexImage2D(…). What does that mean?

edit: I got it working now :smiley:
I didn't know that there is a “power of two” rule. My images all have a resolution of 25x25, so after resizing the texture data with the DevIL function

iluScale(64, 64, ilGetInteger(IL_IMAGE_DEPTH));

OpenGL could load the texture properly and I can display the images (the old texture loader already did that, I just overlooked it). That also explains the 1281 from glGetError(): it is GL_INVALID_VALUE, which older OpenGL versions raise when glTexImage2D gets a non-power-of-two size like 25x25.
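
For reference, here is roughly what my loader looks like now. This is a sketch rather than my exact code; as suggested above, ilInit()/iluInit() are moved out into a one-time setup instead of being called per texture:

GLuint textures::huii(char* fname)
{
	char path[30] = ".\\bitmaps\\";
	strcat_s(path, 30, fname);				// now contains the full path to the bitmap

	ILuint image;
	GLuint imgid;
	ilGenImages(1, &image);
	ilBindImage(image);
	if (ilLoadImage(path) == IL_FALSE) return 0;		// 0 is never a valid texture name
	ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE);		// force RGBA, unsigned byte
	iluScale(64, 64, ilGetInteger(IL_IMAGE_DEPTH));		// resample to a power-of-two size

	glGenTextures(1, &imgid);
	glBindTexture(GL_TEXTURE_2D, imgid);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
	             GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());
	ilDeleteImages(1, &image);				// the GL texture keeps its own copy
	return imgid;
}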

However, I am not sure what resolution I should resize the texture data to. As I said, all my images are currently .bmp files with a resolution of 25x25, and I bind them onto GL_QUADS that also have a size of 25x25 pixels.
Therefore I first tried 32x32, but strangely enough the images were displayed in worse quality than they actually have.

Then I tried a 64x64 resolution and the quality was a lot better. So what would be the best choice?

Indeed, “non-power-of-two” textures must be explicitly supported by your hardware, otherwise they will not work.

Resampling a texture is tricky. One way to deal with missing hardware support is to allocate 32x32 textures but only define the lower-left 25x25 pixels, and then use texcoords in the range 0.0 to 25.0/32.0 instead of 0.0 to 1.0.
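
In code the idea looks roughly like this (untested sketch, assuming the DevIL image is already converted to IL_RGBA / IL_UNSIGNED_BYTE):

// Reserve an empty 32x32 texture, then copy the real 25x25 pixels
// into its lower-left corner.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 32, 32, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);		// NULL = storage only
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 25, 25,
                GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());

// When drawing, only map the defined part of the texture:
const float s = 25.0f / 32.0f;
glBegin(GL_QUADS);
	glTexCoord2f(0.0f, 0.0f); glVertex3f( 0.0f, 25.0f, 0.0f);
	glTexCoord2f(s,    0.0f); glVertex3f(25.0f, 25.0f, 0.0f);
	glTexCoord2f(s,    s   ); glVertex3f(25.0f,  0.0f, 0.0f);
	glTexCoord2f(0.0f, s   ); glVertex3f( 0.0f,  0.0f, 0.0f);
glEnd();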

Ah okay. But I think since it works just fine with the scaling function of DevIL, I’ll stick with it :wink:

However, another problem has occurred. I said before that I wanted to display images with transparency (png files). But it seems OpenGL doesn't use the alpha value that it gets from the texture data, at least not without enabling something.

I did a quick search and also took a look into NeHe's OpenGL tutorial, and found out that you have to use blending to make it work. But why does NeHe turn off the depth buffer when enabling GL_BLEND?
For me it makes no difference (or at least I can't see any) whether the depth buffer is enabled or not, so what is recommended here?

The depth buffer works as intended only when blending is disabled.
When blending happens, depth writes should be disabled, whereas the depth test can stay enabled.

Think about it: if you draw translucent polygon B between an already drawn polygon A in the background and translucent polygon C in the foreground, how do you expect this to work? The solution is to draw all the blended stuff last, sorted back to front (which in effect removes the need to update the depth buffer).
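
As a rough sketch of that rule (drawOpaqueStuff and drawTransparentStuffBackToFront are hypothetical placeholders for your own drawing code):

// Opaque geometry first, with normal depth writes.
drawOpaqueStuff();					// hypothetical

// Blended geometry last, sorted back to front:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);				// stop writing depth, keep testing it
drawTransparentStuffBackToFront();	// hypothetical
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);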

If you don't see a difference, it means your scene is not complex enough, or you happened to hit exactly the correct drawing order. Try turning the camera to the other side of the working scene to verify.

Well, in my game there isn't really any depth, so I guess I don't need the depth buffer anyway, do I? I only had it enabled because I was first going to follow NeHe's tutorial, but at the texture tutorial I decided to make a game instead.

Unlike in the tutorial, however, I display everything only in 2D with the function

gluOrtho2D(0, width, height, 0);

So there is only one big rectangle that consists of little quads, and I move some of them around. Therefore I don't think I need a depth buffer at all, do I?
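
(For completeness, my projection setup is basically just this; a sketch, where width and height are my window dimensions:)

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, width, height, 0);	// pixel coordinates, y axis pointing down
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();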

Thank you for sharing! I want to use this for really big images. How do I do that?

Uhm, I only posted the code so others could help me better. Besides, you can find this in any OpenGL tutorial out there; just search for "binding image OpenGL" or "display image OpenGL".
If you need a method to load your pictures into memory so you can display them, simply search for an image loader that supports your file format.

DevIL is good because it's simple to implement and use (and it supports a lot of different graphics formats). If you want to know how to use it, just read the DevIL documentation.
I can only try to help you if you have a specific problem, but I can't explain the whole "load graphics and display them" process to you…