I have a question, and I am afraid it might be a little vague. I have written a simple program that loads 7 textures and displays them as a floor and as the six faces of a cube. When I run the program on my computer it runs fine, but when I put it on an older computer, only one of the textures shows up. I have the floor and the cube in separate display lists, and I call the floor first (its texture is the one that is seen on all surfaces). The code for the floor and the cube calls glBindTexture() before rendering each plane. Can anyone think of a reason why the textures are getting messed up on the older computer? Thanks for any insight.
What happens when you draw the cube before the floor?
How old is the graphics card in the computer?
Originally posted by Pops:
[b]What happens when you draw the cube before the floor?
How old is the graphics card in the computer?[/b]
When I draw the cube before the floor, I still have the floor texture plastered on the cube. The old computer does not have a separate graphics card, just onboard graphics on the motherboard, which I believe is an NVIDIA Riva 128/128ZX.
How about if you don't draw the floor, just the cube?
Are you sure you are binding the textures correctly?
Ignore that last bit because it works on the other computer.
It must be something to do with the onboard graphics device.
[This message has been edited by Pops (edited 03-30-2003).]
Are you exceeding the max texture size? Query this property with glGetIntegerv(); I think the constant is GL_MAX_TEXTURE_SIZE or something like that.
Aren't you doing something that's against the GL specs? For example, my NVIDIA GF2 binds textures also between glBegin and glEnd. You may check whether you aren't doing something similar.
Originally posted by miko:
[b]For example, my NVIDIA GF2 binds textures also between glBegin and glEnd.[/b]
Really? Is it possible to bind textures between glBegin and glEnd? It sounds great but… I have a GF2 (which I guess means GeForce 2), but it doesn't want to bind textures that way. Maybe my graphics card is a piece of junk, or it's just impolite? I can't believe it.
I don't know… I tried it once and it worked (and after that I realized it's forbidden, so I fixed it). Maybe it only worked under certain circumstances; I never examined it. Maybe some kind of driver bug? I don't care now.
Well, strange things happen sometimes.
I would say it is definitely the texture size. I had a similar experience with a 1024 x 512 texture; I changed it to 512 x 256 and it showed up fine… you have to watch those texture sizes on the lower-end (or non-existent) cards…
I'm on a GF2 also, and I don't bind my textures between begin/end pairs. Think of what would happen if you called glDrawArrays(), which doesn't use begin/end pairs. I also experienced your problem, and here's what I did to correct it: I call glBindTexture(), then set the texture environment functions for that bound texture, then bind a different texture and call the texture environment functions again for that texture. Before, I called the texture environment functions only once for all my textures, which worked on a 3Dlabs card but not on an NVIDIA GF2.
Maybe you have not constructed the texture properly, and it only works sometimes under certain circumstances. I just had that. It's also something un-debuggable, as the things after glTexImage2D() happen inside the chip… at least that was my experience. I don't know exactly why I eventually got it working. Hmm, maybe this does not really help you!?