extension troubles - possibly simple

Hello. I’m using the GL_EXT_texture3D extension on my development computer and it works wonderfully. The problem I am having is bringing it over onto ANY other computer. When I copy the program over to another computer to test/view it there, the texture comes out as a white box. (Well, technically, since I’m slicing it, it comes out as a bunch of white quads.) The thing is, I know for a fact that the computers I try it on support the texture: they don’t output the error that says the extension isn’t supported, and when the program calls glTexImage3DEXT the function doesn’t kick them in the side with an invalid function pointer.
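In case it matters, a safe extension check scans the GL_EXTENSIONS string for an exact, space-delimited token (a plain strstr can accidentally match a longer extension name). This is a minimal sketch of such a check, not my exact code; it is shown against a hard-coded sample list rather than a live glGetString(GL_EXTENSIONS) result, and the helper name is made up:

```c
#include <string.h>

/* Return 1 if `ext` appears as a complete, space-delimited token in
 * `ext_list` (the format of glGetString(GL_EXTENSIONS)), else 0. */
static int has_extension(const char *ext_list, const char *ext)
{
    size_t len = strlen(ext);
    const char *p = ext_list;
    while ((p = strstr(p, ext)) != NULL) {
        int starts_ok = (p == ext_list) || (p[-1] == ' ');
        int ends_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

In a real program you would pass the result of glGetString(GL_EXTENSIONS) as the first argument, e.g. has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_EXT_texture3D").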

I’ve got an nVidia 6800 GT, the computers that I have tested the program on use nVidia 5900 Ultra, 5700 OC, ATI Radeon 9800 Pro, and a few older cards. The programming environment I am using is Visual Studio.NET 2002 (yes 2002) in Windows 2000. The operating systems we have tried on other computers range from Windows 2000 to Windows Server 2003 and Windows XP Pro.

Is there a DLL missing? I have the Microsoft PSDK and nVidia SDK installed on my computer; is there something that I have that the other computers don’t?

Thanks a TON in advance.

It seems like the texture image simply isn’t getting loaded; first check that the image is present and is loaded from disk into memory correctly.

See, that is part of the problem. I store and keep the 3D texture in memory so that I can also look at 2D slices cut through it. So the view is the 3D block (which works correctly on my computer) and 2D slices of what is in memory… (the 2D portion still works on all of the other computers; only the 3D part fails)

Every time I want to view a different part of the 3D texture’s cross section, I reallocate a 2D texture given the index into the 3D array. The texture is loaded into RAM, but how do I check whether it is loaded into video RAM?

Theoretically, 512x512x120 texels at 4 bytes each (about 120 MB) should fit into 128 megs of onboard VRAM, and if not it should be paged out to system RAM, correct?

(the texture is 512x512x120 with 4-byte RGBA color values)

Thanks for the assistance.

I think this is the root of your problem: ARB_texture_non_power_of_two. Conventional OpenGL texturing is limited to images with power-of-two dimensions and an optional 1-texel border. The ARB_texture_non_power_of_two extension relaxes that size restriction for the 1D, 2D, cube map, and 3D texture targets. But it seems like none of your cards except the nv6800 have that extension, and you use a 512x512x120 texture, whose depth of 120 is not a power of two.
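If that is the cause, one common workaround on hardware without ARB_texture_non_power_of_two is to pad the depth up to the next power of two (120 becomes 128), upload the padded volume, and scale the r texture coordinate by 120/128 so only the valid slices are sampled. A small sketch of the rounding helper (plain C; the actual padding and upload are omitted, and the function name is my own):

```c
/* Round `n` up to the next power of two (assumes n >= 1). */
static unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}
```

Here next_pow2(120) gives 128, so the padded texture would be 512x512x128 and the meaningful data would cover r in [0, 120/128).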