Texture Width/Height

The 1.5 OpenGL spec states that a texture specified using glTexImage3D (or 2D) must have dimensions as follows:

ws = 2^n + 2bs
hs = 2^m + 2bs

In other words, the texture dimensions (minus twice the border size bs) must be powers of two.

The 2.0 OpenGL spec states that a texture specified using glTexImage3D (or 2D) must have dimensions as follows:

ws = wt + 2bs
hs = ht + 2bs

In other words, the texture dimensions no longer have to be powers of two; wt and ht can be arbitrary.

My application must support cards/drivers from OpenGL 1.1 onwards (such as Intel integrated graphics systems, which even brand new off the line today are only OpenGL 1.5-ish). So, all fine and good: before allocating textures I can test the GL version (i.e. < 2.0) to determine whether power-of-two textures are required, and that will work at least most of the time.

My question, though, is this (as I have been burnt in the past in cases like this): has anyone seen, or does anyone know of, vendors/drivers that don't support non-power-of-two textures but do report GL version 2.0 or greater when queried? I know it is not according to the spec, but I have seen vendors do stuff like this (for other features) in the past (especially for mobile cards, for instance).

I want to know whether I need to expect the unexpected in my application. Anyone have suggestions? Will I be OK simply testing the GL version, or is there some other way I can test whether this is supported?
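As a starting point, the version test above can be sketched as a small helper. In the application the string would come from glGetString(GL_VERSION); the parsing itself needs no GL context, so it can be tried on sample strings. This is a minimal sketch, and the function names are my own, not from any GL API:

```c
#include <stdio.h>

/* Parse a GL_VERSION string such as "2.1.2 NVIDIA 173.14.12" into
 * major/minor numbers. Returns 1 on success, 0 on failure.
 * In the application you would feed it glGetString(GL_VERSION). */
static int parse_gl_version(const char *version, int *major, int *minor)
{
    if (!version || sscanf(version, "%d.%d", major, minor) != 2)
        return 0;
    return 1;
}

/* NPOT textures are only required by the spec from GL 2.0 onwards. */
static int version_supports_npot(const char *version)
{
    int major = 0, minor = 0;
    if (!parse_gl_version(version, &major, &minor))
        return 0;
    return major >= 2;
}
```

As the rest of the thread shows, a version of 2.0+ only tells you NPOT is available, not that it is fast, so this check alone is not the whole story.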

You could check for the ARB_texture_non_power_of_two extension.
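The extension check can be sketched like this. One pitfall worth noting: a plain strstr() on the string from glGetString(GL_EXTENSIONS) can match an extension whose name merely contains the one you are looking for, so the helper below matches whole space-delimited tokens. The helper name is mine, not a GL call:

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
 * `extlist` (the string returned by glGetString(GL_EXTENSIONS)). */
static int has_gl_extension(const char *extlist, const char *name)
{
    size_t len;
    const char *p;

    if (!extlist || !name || (len = strlen(name)) == 0)
        return 0;
    p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == extlist) || (p[-1] == ' ');
        int ends_token   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_token && ends_token)
            return 1;
        p += len;
    }
    return 0;
}
```

In the application you would call it as, for example, has_gl_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_texture_non_power_of_two").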

I do know of at least one vendor/card/driver combo that is at least OpenGL 2.0, reports the extension, but sends you through software emulation if you use it. OK, it’s the Nvidia GeForce FX series (5200, possibly more) and it may be fixed in more recent drivers (it’s been about 5 years since I tested this), but it’s definitely something to add to the list of items to be concerned about if you want more robust downlevel hardware support.

Actually, the FX series (like the Radeon 9xxx) can use NPOT textures without software rendering; however, they can't mipmap them without the software fallback.
Older generations can't handle NPOT textures at all.
You could try texture rectangles (ARB_texture_rectangle) as a compromise? Just a thought.
The only thing left is to include a small benchmarking routine that creates an NPOT texture, renders with it, and compares the timing against a POT texture.
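A rough skeleton for that benchmarking idea might look like the following. The actual GL work (bind the NPOT or POT texture, draw a quad, glFinish() to flush the pipeline) would go in the callback; the function names and the use of clock() for timing are placeholder choices for this sketch, not anything from the GL API:

```c
#include <time.h>

/* Time `iterations` calls of a draw callback and return the elapsed
 * seconds. In a real test, `draw` wraps the GL calls that render with
 * the texture under test and ends with glFinish(). */
static double time_draw(void (*draw)(void), int iterations)
{
    clock_t start = clock();
    int i;
    for (i = 0; i < iterations; ++i)
        draw();
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

/* A no-op stand-in so the harness can be exercised without a GL context. */
static void draw_nothing(void) {}
```

If the NPOT pass comes out, say, an order of magnitude slower than the POT pass, you are almost certainly hitting the software fallback and can fall back to padded power-of-two textures.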

Ah, that makes sense. If it’s anything like D3D’s D3DPTEXTURECAPS_NONPOW2CONDITIONAL - which I suspect it is - additional restrictions would be that it can’t use GL_REPEAT or a compressed internal format without also going back to software.

That’s a very wise suggestion and would also handle those Intel chips that the OP is talking about, which may well support textures without the restrictions above but not do so optimally.

You're also quite right about the compression: this will result in software emulation on those older-generation cards.