I’m working with MFC and have a dialog box. In “OnInitDialog” I load my texture like this:
glTexImage2D(GL_TEXTURE_2D, 0, 3, TextureImage->sizeX, TextureImage->sizeY, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureImage->data);
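For context, the full loading sequence around that call usually looks roughly like this (a sketch only; `TextureImage` and `m_texture` are stand-ins for the poster’s own variables, and a valid rendering context must already be current):

```cpp
GLuint m_texture = 0;

glGenTextures(1, &m_texture);              // ask GL for a texture name
glBindTexture(GL_TEXTURE_2D, m_texture);   // make it the current 2D texture

// Filtering should be set explicitly: the default minification filter
// expects mipmaps, and without them the texture is incomplete.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// GL_RGB as internal format is clearer than the legacy "3".
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
             TextureImage->sizeX, TextureImage->sizeY, 0,
             GL_RGB, GL_UNSIGNED_BYTE, TextureImage->data);
```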
In the “OnRender” function I set up the texture. And of course I define the texcoords for the triangles…
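The per-frame setup being described would look roughly like this (a sketch, assuming `m_texture` holds the name generated during initialization and the triangle coordinates are placeholders):

```cpp
glEnable(GL_TEXTURE_2D);                   // texturing must be enabled
glBindTexture(GL_TEXTURE_2D, m_texture);   // select the texture loaded earlier
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

glBegin(GL_TRIANGLES);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
    glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
glEnd();
```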
So this way, nothing happens. My object appears untextured. But if I move the (expensive) loading code into the render function, the texturing works… Any ideas?
I do the same thing, but I have the glTexEnvf call in the initialization and it works fine. I believe you need to define GL_TEXTURE_ENV_MODE (and GL_TEXTURE_ENV_COLOR) before you call glTexImage2D.
I set the env mode as you said (by color, do you mean the vertex color? Yes, I set that before), but nothing changed…
It seems as if it loses the texture somehow. My Win32 app at home works fine, too. It looks as though MFC causes this problem, but that’s also strange; I thought MFC doesn’t influence OpenGL. Any guess?
Maybe this could be a hint. It seems strange to me, but I don’t know how to interpret it:
After calling “glGenTextures(1, &m_texture);”, m_texture (I changed the name) contains “3452816845”. When I run a working demo from NeHe’s page, his texture variable contains “1”.
Does this mean that it couldn’t find a place in memory to store my texture?
Theoretically, 3452816845 should be a perfectly valid texture ID. Only 0 is an invalid texture ID. But if you throw that number into a decimal-to-hexadecimal converter, you will see that the hexadecimal version of that number is 0xCDCDCDCD. Makes me believe that the variable is still unchanged (some compilers fill uninitialized variables with a bit pattern for easy recognition of uninitialized variables when debugging). This, in turn, makes me believe you are asking for a texture ID before you have a valid rendering context. Without context, OpenGL functions do nothing.
Yes, you’re right, Bob. I called “LoadTexture” too soon. Thanks for your help.