open gl texturing question

To make myself clearer… I call BMP.bmBits when I want to access the bitmap data, not a reference to hBMP, which I have deleted…

Originally posted by Mr.JT:
Relic: You said I delete hBMP too early… but I don't use that pointer after I've deleted it, as GetObject transfers the bitmap data to the BITMAP structure BMP, and subsequently I use that to access the bitmap data. Surely this is OK, as all the data has been transferred to the BMP structure before I delete hBMP?

Actually, I think I said that, and yes, it can still cause problems. BMP.bmBits is just a pointer to memory. If you tell Windows to delete that memory, the image might still be there, BUT if you allocate memory after you delete the bitmap, you could accidentally overwrite your texture. This may not cause a problem immediately either, because I think OpenGL will copy the texture data to the video card, but if something happens and the video memory gets wiped (say an Alt-Tab, or a screen saver kicking in), OpenGL will go back to find that image and, surprise surprise, will pull up the garbage data. It's usually a good idea to keep the textures in memory until you're sure they're not needed anymore (or at least for a while), and reload them if they become necessary.

[This message has been edited by Nychold (edited 02-13-2004).]

I understand… have changed it… but it's still not working. Time to play around with GL_BGR/GL_RGB etc.! Thanks to everyone for the help so far… sorry it's turned out to be such a long one… but no doubt it'll get longer!

Just to throw in another way of doing things: my choice for accessing bitmap bits is GetDIBits(), which actually copies the bits to your desired memory location.
Hey, this is all basic GDI stuff; go grab some working code from somewhere and just plug it in.
I don't use BMP files normally; I load uncompressed TGA files, which are so simple to handle, or use a PNG lib if images start to get big. Where the heck did I put that 16k*16k terrain texture…

You have a glLoadIdentity() just before swapping the buffers. That might cause some problems.

Originally posted by mithun_daa:
You have a glLoadIdentity() just before swapping the buffers. That might cause some problems.


I can't find GL_BGR as an option for glTexImage2D?? It comes up as an undeclared identifier when compiling. How do I define it?

It's an extension; download glext.h and use GL_BGR_EXT, or just try GL_RGB. It may look weird (i.e. blue and red reversed), but at least you'll know your texturing is working.

OK, well, none of these things is working… the texture seems to be a mixture of noise and warped Windows icons from the desktop, and it's different each time I run the program. Could this be an error in how I created the resource file for Visual Studio? Although I followed the NeHe tutorial exactly… maybe I'll try loading the picture directly from a file.

The bmBits pointer is invalid after you have deleted the bitmap object. Remember, it is not YOUR application that supplies the pointer to which the bitmap data is copied; GDI allocates that memory and frees it when you destroy the bitmap object.
The solution is very simple: why do you keep your glTexImage2D inside the drawBackground procedure? The texture needs to be loaded only once.
glTexImage2D must be placed immediately after glBindTexture, where bmBits is still valid and the bitmap can be loaded into video memory. After calling glTexImage2D, you can delete the bitmap object.
Also, the first version of your code (the one that used gluBuild2DMipmaps) had a call like glBindTexture(GL_TEXTURE_2D, background.tex). You initialize background.tex to 0 and then pass it to glBindTexture without ever assigning it any value at all! What did you expect to happen?
And use GL_BGR or GL_RGB.

[This message has been edited by mikeman (edited 02-15-2004).]


If I don't sort it this time, I promise to stop posting!! I've changed it so many times and read so many different styles of achieving it that I think I have probably just made it worse. I still get a weird jumble of rubbish which definitely isn't my texture, and the most obvious indication that something is wrong is that the line cout << BMP.bmWidth << endl; in drawBackground() prints 0 (zero), so nothing is actually in the BMP structure. And I have no idea why. Could be that I'm just getting delusional, but the following code should work now, shouldn't it…?

#include <windows.h>				// Win32 API (LoadImage, GetObject, BITMAP)
#include <iostream>
#include <gl\glut.h>
#include "resource.h"

using namespace std;

class Graphics{
public:
	struct object{				// Create A Structure Called Object
		int tex;			// Integer Used To Select Texture
		float x;			// X Position
		float y;			// Y Position
		float z;			// Z Position
	};

	GLuint texture[1];			// Storage For 1 Texture
	object background;			//create object 'background' that will hold background BMP

	HBITMAP hBMP;				// Handle Of The Bitmap
	BITMAP BMP;				// Bitmap Structure

	Graphics(){};				//default - currently does nothing
	Graphics(int argc,char* argv[]);	//overloaded constructor
	~Graphics(){ DeleteObject(hBMP); }	//destructor - Delete The Bitmap Object

	void loadBackgroundTexture(void);
	void drawBackground(void);
};

Graphics::Graphics(int argc,char* argv[]){
	glutInit(&argc,argv);			//start glut
	glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGB);	//double buffered for animation. RGB colour mode
	glutInitWindowSize(300,300);		//window size
	glutInitWindowPosition(300,150);	//window position
	glutCreateWindow("PicassoTalk! J.Trevan : 2k4");	//create window
	glClearColor(0.0f,0.0f,0.0f,0.0f);	//clear colour = black
	glMatrixMode(GL_PROJECTION);		//select projection mode
	glLoadIdentity();			//reset projection view
	gluPerspective(45.0f,(GLfloat)300/(GLfloat)300,0.1f,100.0f);	//perspective
	glMatrixMode(GL_MODELVIEW);		// Select The Modelview Matrix
	glLoadIdentity();			// Reset The Modelview Matrix
	glClearDepth(1.0f);			// Depth Buffer Setup
	glDepthFunc(GL_LEQUAL);			// The Type Of Depth Testing (Less Or Equal)
	glDisable(GL_DEPTH_TEST);		// Disable Depth Testing
	glShadeModel(GL_SMOOTH);		// Select Smooth Shading
	glHint(GL_PERSPECTIVE_CORRECTION_HINT,GL_NICEST);	// Set Perspective Calculations To Most Accurate
}

void Graphics::loadBackgroundTexture(void){

	BYTE Texture[]={ IDB_BACKGROUND2 };	// The ID of background bitmap to load

	//The following loads the bitmap image. MAKEINTRESOURCE converts the int to a resource
	//name, i.e. which bitmap to load. IMAGE_BITMAP says the image being loaded is a bitmap.
	//The next two zeros are the required size of the image, x and y; we want the default
	//image size, so leave them as zero. LR_CREATEDIBSECTION makes LoadImage return a DIB
	//section bitmap, which is what makes BMP.bmBits point at the pixel data below.
	//hBMP now holds the bitmap that is loaded by LoadImage().
	hBMP = (HBITMAP)LoadImage(GetModuleHandle(NULL), MAKEINTRESOURCE(Texture[0]),
			IMAGE_BITMAP, 0, 0, LR_CREATEDIBSECTION);

	if (hBMP) {				// Does The Bitmap Exist?
		GetObject(hBMP, sizeof(BMP), &BMP);	// Get The Object
		// hBMP: Handle To Graphics Object
		// sizeof(BMP): Size Of Buffer For Object Information
		// &BMP: Buffer For Object Information
		glPixelStorei(GL_UNPACK_ALIGNMENT,4);	// Pixel Storage Mode (Word Alignment / 4 Bytes)
		glGenTextures(1, texture);		// Reserve one texture name in texture[0]
		glBindTexture(GL_TEXTURE_2D, texture[0]);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);	// No mipmaps, so don't use a mipmap filter
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, BMP.bmWidth, BMP.bmHeight,
			0, GL_BGR_EXT, GL_UNSIGNED_BYTE, BMP.bmBits);	// Upload while bmBits is still valid
	}
	else{
		cerr << "hBMP doesn't point to anything!! Error loading background bitmap!" << endl;
	}
}

void Graphics::drawBackground(void){
	glLoadIdentity();			//Reset modelview matrix
	glTranslatef(0.0f,0.0f,-3.0f);		//into screen
	cout << BMP.bmWidth << endl;
	glEnable(GL_TEXTURE_2D);
	glBindTexture(GL_TEXTURE_2D, texture[0]);
	glBegin(GL_QUADS);			//textured quad filling the background
	glTexCoord2f(0.0,0.0); glVertex3f(-1.0,-1.0,0.0);
	glTexCoord2f(0.0,1.0); glVertex3f(-1.0, 1.0,0.0);
	glTexCoord2f(1.0,1.0); glVertex3f( 1.0, 1.0,0.0);
	glTexCoord2f(1.0,0.0); glVertex3f( 1.0,-1.0,0.0);
	glEnd();
}



Also, what is glGenTextures(1,texture); doing? It doesn't seem as though this is in the right place or is actually doing anything… what is actually put in texture[0]?

void glGenTextures(GLsizei n,GLuint *textureNames);

Returns n currently unused names for texture objects in the array textureNames. The names returned in textureNames do not have to be a contiguous set of integers.

The names in textureNames are marked as used, but they acquire texture state and dimensionality (1D or 2D) only when they are first bound.

Zero is a reserved texture name and is never returned as a texture name by glGenTextures().

This is a direct quote from the OpenGL reference book. Basically, glGenTextures() allocates the texture objects from OpenGL for you to use. It's almost required if you use multiple textures, but I make it a habit to use it even for one texture. :)

No, it's not required unless your program also contains software you didn't write that uses texture handles in the same context, handles you never created yourself.

It doesn’t actually allocate anything, it merely spits out and reserves texture names (integers) that have not been used before.

The business of using glGenTextures can be dispensed with entirely. Just using unique non-zero integers for each texture works just fine. glGenTextures is a way of obtaining unique names; it isn't essential at all.


Found the answer! It was to do with the line hBMP=(HBITMAP)LoadImage(GetModuleHandle(NULL),MAKEINTRESOURCE(Texture[0]), IMAGE_BITMAP, 0,
0, LR_DEFAULTCOLOR); When reading through tutorials, I read that the last parameter, which was originally LR_CREATEDIBSECTION, meant that the bitmap information was stored without the colour information (NeHe lesson 38). I obviously wanted colour in my bitmap, so I changed this to LR_DEFAULTCOLOR, as the manual described it as using standard colours. I just got confused between the meanings of these parameters, and although I still don't actually understand the meaning of LR_CREATEDIBSECTION, it doesn't matter, as it works!! Once I'd done this, I had to switch the colour modes round to BGR, as someone mentioned before. This thread dies here!!!