Strange Texture Effect: a Bug in OpenGL?

I’ve been playing around with creating a 3D game engine in OpenGL. I’m using some textures I pulled off the web, stretched down to 64x64, and displaying them that way in the engine (just because the sample I learned from used 64x64). It works fine, but the textures don’t look all that great because of the distortion and data loss, so I want to increase the texture resolution.
My texture loading routine reads the width and height from two defines, and references a configuration file pointing at the raw data format I’m using for storing texture images.
All I change is the width from 64 to 512 and the height from 64 to 512, and update the configuration file to reference the data in raw 512x512 RGBA format (the same pixel format I use for 64x64), and something extremely odd happens.
My planes start constantly switching textures. It seems to do it dynamically in no apparent pattern. If I don’t alter the view at all, the texture flickers back and forth between other textures that I’ve loaded up.
It works fine in 64x64. If I don’t move the view the picture stays constant as it should, but in 512x512, the textures are constantly flickering around, even splitting a single quad among multiple textures apparently in random places…
In places where the view isn’t moving, it looks as it should, just like the source BMP. All of the textures are loading up, but they no longer map to the planes properly.
So, finally, to my question:
Is this a bug in OpenGL? I’ve no doubt that the video card doesn’t support 512x512 textures. But should that cause this effect? I would think it would just cause a slowdown (which it does, from about 25+ fps to 8-10 fps).
Is it unreliable to use textures above 256x256? Is the OpenGL software rendering unreliable at 512x512? Is it unreliable at lower texture dimensions?
Any help on this issue would be greatly appreciated.

Mike Van Til

P.S. Before posting, I wanted to confirm it… I applied the same technique at 256x256 and it worked fine.

You can use glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize) to get the maximum texture size supported.

Actually, using glTexImage2D with the GL_PROXY_TEXTURE_2D target is the suggested practice, but the above query should let you know what texture sizes can be handled.

Of course, it can always be a bug.

Are you using a Voodoo1, Voodoo2 or Voodoo3 board? There, the max texture resolution is 256x256!