Memory texture

Originally posted by def:
[b]Try NeHe’s lesson 6. The texture loading code is resolution independent (it just needs to be a power of two), so all you need to do is swap the bmp or resize it to test different resolutions. [/b]
i can do a 4096x4096 texture with the demo even if i rebuild it!

any ideas why this won’t work with a realistic application? i’m not uploading what i would think is a lot of textures; maybe 6~10 512x512 textures at any one time, i’d guess. but if i try to upload a 1024x1024 texture at any point, you can bet it will come up all white.

what could be causing this, given that i could run the nehe demo with 4096 textures even after rebuilding it? as far as i can tell my context instantiation and texture upload process are identical to the nehe demo as well.



Originally posted by Relic:
[b]michagl, do other people’s programs show the same problem on your system?
If your reproducer is not complex, show the whole code.
Check glGetError after the glTexImage2D to see if it succeeded.
Newer drivers is always a good advice.

More involved checks:
AGP aperture size in the system BIOS.
glGetString(GL_VERSION) must contain AGP on an AGP board. If you have PCI there your motherboard chipset drivers are not correctly installed. (huge performance loss and texture space limited to video mem).[/b]
for the record, glGetError does not flag anything after the failed texture upload, and glGetString(GL_VERSION) returns “1.5.2” … no AGP, PCI, or anything… i’m assuming everything must be running at full AGP8x though. my machine runs other people’s demos at their reported frame rates on similar hardware.

Since you seem absolutely positive your OpenGL code is not the problem, check your texture loading code.
You could check your textures with glGetTexImage() to see if the problem occurs during rendering or before.

oiiii this one was a bugger…

i did a lot of testing and tracked the problem back to an old routine that checks whether or not a texture is a power of two.

as it happens, there was also a constant i was not aware of, ‘MAX_TEXTURE_SIZE’, that the routine checks against.

i can’t remember where i originally picked up the beginnings of what would eventually become my texture management system some 4 years ago… but i’ve totally overhauled it in the meantime, and this little constant and this seemingly benign-looking function, which i presumed only tested for pow2, caused the whole thing. turns out the only place i actually used the function was right before uploading a texture. there was no assertion break because i had modified the texture system to replace its error handling, and in haste i had just commented out the old error handling in that bit, leaving only a return out of the function. i thought this was harmless and i would just get a white texture if somehow i had used a non-power-of-two… but like i said, the function also rejected on the buried MAX_TEXTURE_SIZE constant… and since that was the only early exit, i had never bothered tracing into the upload routine.

anyhow, as disappointed with myself as i am for still having this old textbook code in my system due to neglect… i’m also very excited to be able to use comparably massive textures! so please don’t spoil my party with boo~hoos.



glGetString(GL_VERSION) returns “1.5.2” … no AGP PCI or anything…

Oops, my bad, it’s under GL_RENDERER.

(The single step debugger is your friend. :wink: )