internal texture formats, color banding

I have a couple questions:

Are specific internal texture formats (like GL_RGBA8, not GL_RGBA) dependent on:

  1. the amount of available video memory?

  2. the OpenGL version?

  3. the format of the source texture?

  4. the display settings of your desktop?

  5. Also, is there any way to query which internal texture format is actually being used
    after calling glTexImage2D?

Whenever I specify GL_RGBA8 when creating a texture with an alpha channel,
I get color banding, which tells me OpenGL ignored my request and used something
else.

But, if I specify a format without an alpha channel, the texture is smooth
and I don’t get color banding.

I am doing this:
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer );

I thought specifying a sized internal format meant OpenGL would use it,
no matter what, even if I have a crappy card?

1-4: Maybe, or maybe not. The specs say:

If a sized internal format is specified, the mapping of the R, G, B, A, and
depth values to texture components is equivalent to the mapping of the corresponding
base internal format’s components, as specified in table 3.15, and the memory
allocation per texture component is assigned by the GL to match the allocations
listed in table 3.16 as closely as possible. (The definition of closely is left up to the
implementation.
Implementations are not required to support more than one resolution
for each base internal format.)

  5: Yep.
glGetIntegerv(GL_TEXTURE_INTERNAL_FORMAT, &fmt);

Ah, great! I didn’t know about GL_TEXTURE_INTERNAL_FORMAT. Thanks.

At least I can find out what that sneaky bastard OpenGL gave me.

I see it listed under glGetTexLevelParameter.


Ah right, it is a tex parameter. Sorry.
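
For completeness, a minimal sketch of the corrected query (assuming a current GL context; w, h and buffer stand in for whatever the real code uses). GL_TEXTURE_INTERNAL_FORMAT and the per-component GL_TEXTURE_*_SIZE values are texture level parameters, so they go through glGetTexLevelParameteriv rather than glGetIntegerv:

#include <GL/gl.h>
#include <stdio.h>

/* Assumes a current GL context; w, h and buffer are placeholders for the
   real texture size and pixel data. */
void create_and_inspect_texture(int w, int h, const void *buffer)
{
    GLuint tex;
    GLint fmt, r, g, b, a;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* GL_RGBA8 is only a request; the driver picks the closest match it supports. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    /* These are texture level parameters, so query them with glGetTexLevelParameteriv. */
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);

    printf("internal format 0x%x, bits per component: R%d G%d B%d A%d\n",
           fmt, r, g, b, a);
}

The per-component sizes make the banding easy to diagnose: if, say, the red and alpha sizes come back as 4 instead of 8, the implementation fell back to a 16-bit format despite the GL_RGBA8 request.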