Texture Problem

I am having a problem displaying my texture on a quad. If I draw it with glDrawPixels, it's fine, but I would rather use a quad since it is more flexible. The pixel data is stored in an array as unsigned chars.

Below are the code snippets. Can anyone tell me what I am doing wrong? Thanks!

// generate a 2d texture image from an array
pBitmap->w = (int)array[0]; // first 2 entries hold width & height.
pBitmap->h = (int)array[1];
pBitmap->pixels = &array[2];
glGenTextures( 1, &(pBitmap->texId) );
glBindTexture(GL_TEXTURE_2D, pBitmap->texId );
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, pBitmap->w, pBitmap->h, 0, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

// draw a texture (2d), x axis across, y axis down

#if 1

glRasterPos2f( (float)x+.5, (float)y + (float)pBitmap->h + .5);
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glDrawPixels( pBitmap->w, pBitmap->h, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

#else

glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, pBitmap->texId );

glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );
glBegin( GL_QUADS);

// Bottom Left
glTexCoord2f( 0, 0 );
glVertex2i( x, y );
// Top left
glTexCoord2f( 0, 1 );
glVertex2i( x, y+h );
// Top right
glTexCoord2f( 1, 1 );
glVertex2i( x+w, y+h );
// Bottom right
glTexCoord2f( 1 , 0 );
glVertex2i( x+w, y );
glEnd();
glDisable( GL_TEXTURE_2D );

#endif

Can you see the quad OK?

Yes, I get a white quad. At first I thought it was back-face culling, but I turned back-face culling on and the quad is still there. Then I thought it was the vertex order, so I tried changing it around. Nothing seems to get it to work…
I assume it is something I am doing wrong in the texture setup phase.

Does your texture have dimensions that are a power of 2? i.e. 32x32, 64x32, 128x128, etc.

If not, then that is your problem.
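
A quick way to verify in code, if you want to be certain (this little helper is just a sketch, not from the thread):

// A power of two has exactly one bit set, so v & (v - 1) clears it to zero.
int isPowerOfTwo(int v)
{
    return v > 0 && (v & (v - 1)) == 0;
}

// e.g. isPowerOfTwo(pBitmap->w) and isPowerOfTwo(pBitmap->h) should both hold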

The textures are powers of 2 (128x128 & 64x64).
None of the GL commands are returning errors.
The texture IDs seem to be OK.
Are there some glEnable/glDisable commands I can try turning on/off that might help determine what is going on?
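
A quick sanity check along those lines, just before drawing the quad (a sketch, assuming <stdio.h> and the pBitmap from the first snippet):

GLenum err;
printf("texturing on:  %d\n", (int)glIsEnabled(GL_TEXTURE_2D));
printf("valid texture: %d\n", (int)glIsTexture(pBitmap->texId));
while ((err = glGetError()) != GL_NO_ERROR)
    printf("GL error: 0x%04x\n", err);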

Have you tried using gluBuild2DMipmaps to create your texture?
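
For reference, a sketch of what that would look like with the bitmap from the first snippet (assumes <GL/glu.h>); gluBuild2DMipmaps replaces the glTexImage2D call, and the min filter can then use a mipmap mode:

// Upload all mipmap levels instead of just level 0.
glBindTexture(GL_TEXTURE_2D, pBitmap->texId);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, pBitmap->w, pBitmap->h,
                  GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels);
// With mipmaps present, the min filter can use a mipmap mode:
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);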

Are you creating the textures after you’ve initialized the window to use OpenGL?
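
Worth checking because glGenTextures and glTexImage2D need a current GL context; called before the window/context exists, they typically do nothing. A GLUT-flavored sketch of the safe ordering (createTextures() and display() are hypothetical placeholders):

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("texture test"); // a GL context exists from here on...
    createTextures();                 // ...so glGenTextures/glTexImage2D are safe here
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}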

What are the physical sizes of your textures?

I'm not sure what you are doing wrong, but the above code would be much, much simpler using the DevIL library. You can find it at www.imagelib.org. Its syntax matches OpenGL's and it is very intuitive.

Here is an example of loading an image into OpenGL:

ILuint Image;
GLuint TexId;

ilGenImages(1, &Image);
ilBindImage(Image);
ilLoadImage("WhateverIwant.tga");
TexId = ilutGLBindTexImage();        // uploads the image and returns a new GL texture ID
ilDeleteImages(1, &Image);           // the IL image is no longer needed once uploaded
glBindTexture(GL_TEXTURE_2D, TexId); // bind the GL texture, not the IL image name

and bang, you're done. Now everything is bound and set for OpenGL. It's that easy. You can also build mipmaps this way, with a call to ilutGLBindMipmaps() instead of ilutGLBindTexImage(), and boom, your mipmaps are done. I would recommend this way. But as I stated before, I have no idea what your current problem is.

Pops - Nope. How would this help?
Deiussum - Yes. There are no reported OpenGL errors.
Jeffry - 128x128 & 64x64.

I am going to put this on the back burner for now and come back to it in a couple of days, once I've finished fleshing out some other sections of code.

Thanks for the help.

You may want to make sure the texture matrix is the identity, and that the clamping modes are set to clamp (or try wrap too).
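
In code, that would be something like this right after binding the texture (a sketch, matching the fixed-function setup above):

// Make sure nothing has left a transform in the texture matrix.
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);

// Clamp the texture coordinates (or try GL_REPEAT instead).
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);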

Maybe you have waxx0r wrapping/env settings. Add these right after your other glTexParameter calls:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

And then add this once somewhere (preferably during initialization), just for convenience's sake. If you want modulation later, add it back in, but for now this just gets things going:

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

If there is one error I see more often than any other, it's people calling this with a first argument of GL_TEXTURE_2D. No! The only valid target here is GL_TEXTURE_ENV. Make sure you don't have that; it'll shoot you in the foot from the get-go.
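
In other words:

glTexEnvi(GL_TEXTURE_2D,  GL_TEXTURE_ENV_MODE, GL_REPLACE); // WRONG: invalid target, raises GL_INVALID_ENUM
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE); // right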