Once-functional texture loader doesn't load

I wrote a simple texture loader for an OpenGL application I’m writing, and it worked great at one point, but now it doesn’t load anything. Can anyone help?

long LoadTexture(handle tex_handle, unsigned int res_index, unsigned int size, int components, int frames, float frame_time)
{
    /* Leave if:
     * - we do not have a valid texture handle
     * - size is under 5 (32x32): it wouldn't be too useful, and I don't think
     *   any video cards even support sizes that small
     * - size is over 11 (2048x2048): afaik no video card on the market can take
     *   anything larger, and it would eat a good bunch of space in RGBA w/o compression
     */
    if (tex_handle < 0 || tex_handle >= MAX_TEXTURES || size < 5 || size > 11)
        leave; // let's get outta here (macro)

    /* 2^size x 2^size texels, 'components' bytes per texel */
    GTextures[tex_handle].data = malloc((size_t)(1 << size) * (1 << size) * components);
    if (GTextures[tex_handle].data == NULL)
        return false;

    // This loads the texture from a resource

    int channels = 0;

    switch (components) {
    case 1: channels = GL_LUMINANCE; break;
    case 2: channels = GL_LUMINANCE_ALPHA; break;
    case 4: channels = GL_RGBA; break;
    case 3:
    default: channels = GL_RGB; break;
    }

    if (tex_handle < MAX_TEXTURES)
        glGenTextures(1, &GTextures[tex_handle].GLtex_handle);
    else
        return false;

    glBindTexture(GL_TEXTURE_2D, GTextures[tex_handle].GLtex_handle);
    gluBuild2DMipmaps(GL_TEXTURE_2D, components, 1 << size, 1 << size, channels, GL_UNSIGNED_BYTE, GTextures[tex_handle].data);
    //glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1 << size, 1 << size, 0, channels, GL_UNSIGNED_BYTE, GTextures[tex_handle].data);

    GTextures[tex_handle].frame_time = frame_time;
    GTextures[tex_handle].frames = (frames == 0) ? 1 : frames;
    GTextures[tex_handle].size = size;

    return true;
}

I can’t remember what I did to break it, because I changed several things in there at once, and it just stopped working. I’ve checked the resource loader; it seems to work fine. I have all sorts of error handling code, and everything returns good. Maybe I should check for OpenGL errors? (doh doh doh! :P)

C’mon, you guys! Some help you are! :stuck_out_tongue:

hope this is it:
your line:

glGenTextures(1, >extures[tex_handle].GLtex_handle);

should be
glGenTextures(1, &GTextures[tex_handle].GLtex_handle);


Um, maybe it’s because you commented out your glTexImage2D() call?


He uses gluBuild2DMipmaps right above the commented-out glTexImage2D code.

Incidentally, texture sizes less than 32x32 are supported on all the cards I’ve tried them on.


Uhh… that line reads just fine to me… if I did indeed write it that way, my compiler would barf. :stuck_out_tongue: I’ll also post this in the advanced forum and see if I can’t get more help there.