I read in the man pages under GL_NUM_COMPRESSED_TEXTURE_FORMATS:
“params returns a single integer value indicating the number of available compressed texture formats. The minimum value is 4.”

When I query GL_NUM_COMPRESSED_TEXTURE_FORMATS with glGetIntegerv, it returns 3.

The man page is wrong.

It assumes that the RGTC formats promoted to core should be reported by this query.
But they shouldn’t be, per ARB_texture_compression_rgtc Issue 19.

This query is essentially useless. If you want to know which formats are supported, check the core version and the extension list instead.
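A sketch of that extension check. On a core profile you would walk the list with glGetStringi(GL_EXTENSIONS, i); with the legacy single-string glGetString(GL_EXTENSIONS), a naive strstr match can false-positive on prefixes (e.g. matching GL_EXT_texture_compression inside GL_EXT_texture_compression_s3tc), so the match must be delimited. The helper name `has_extension` is mine, not a GL function:

```c
#include <assert.h>
#include <string.h>

/* Returns 1 if `ext` appears as a whole, space-delimited token in the
   extension string `all`; 0 otherwise. */
static int has_extension(const char *all, const char *ext)
{
    size_t n = strlen(ext);
    const char *p = all;
    while ((p = strstr(p, ext)) != NULL) {
        /* The match must start at the beginning of the string or after a
           space, and end at a space or the terminating NUL. */
        int start_ok = (p == all) || (p[-1] == ' ');
        int end_ok   = (p[n] == ' ') || (p[n] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += n; /* skip past this partial match and keep scanning */
    }
    return 0;
}
```

In a real program `all` would come from glGetString(GL_EXTENSIONS) on a compatibility context; on core profiles, compare each glGetStringi result directly instead.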

I suppose that’s why the first ‘format’ it returns equals 0x83F0, which isn’t defined as anything in my gl3.h file?

0x83F0 = GL_COMPRESSED_RGB_S3TC_DXT1_EXT, per EXT_texture_compression_s3tc.

This format, though (almost) universally supported, was never promoted into core, so it’s not in gl3.h.
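Since gl3.h doesn’t carry them, the S3TC tokens have to come from glext.h or be defined by hand. The values below are from the EXT_texture_compression_s3tc spec; the `s3tc_name` lookup helper is just a hypothetical convenience for printing what GL_COMPRESSED_TEXTURE_FORMATS hands back:

```c
#include <assert.h>
#include <string.h>

/* Token values from EXT_texture_compression_s3tc (normally in glext.h). */
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT   0x83F0
#define GL_COMPRESSED_RGBA_S3TC_DXT1_EXT  0x83F1
#define GL_COMPRESSED_RGBA_S3TC_DXT3_EXT  0x83F2
#define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT  0x83F3

/* Map a queried compressed-format enum to a readable name. */
static const char *s3tc_name(int fmt)
{
    switch (fmt) {
    case GL_COMPRESSED_RGB_S3TC_DXT1_EXT:  return "GL_COMPRESSED_RGB_S3TC_DXT1_EXT";
    case GL_COMPRESSED_RGBA_S3TC_DXT1_EXT: return "GL_COMPRESSED_RGBA_S3TC_DXT1_EXT";
    case GL_COMPRESSED_RGBA_S3TC_DXT3_EXT: return "GL_COMPRESSED_RGBA_S3TC_DXT3_EXT";
    case GL_COMPRESSED_RGBA_S3TC_DXT5_EXT: return "GL_COMPRESSED_RGBA_S3TC_DXT5_EXT";
    default:                               return "unknown";
    }
}
```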

Thanks for the quick reply :smiley:

Edit: A misspelling in the man pages threw me off on this one. They have GL_DRAW_FRAMEBFUFER_BINDING instead of FRAMEBUFFER :eek:

Here’s a new one. These are in gl3.h, but querying with glGetIntegerv leaves my GLint unchanged:

glGetIntegerv( GL_MAX_VERTEX_STREAMS, &lint );
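That symptom is what an unrecognized enum looks like: GL_MAX_VERTEX_STREAMS comes from ARB_transform_feedback3 (core in GL 4.0), so on a 3.x context glGetIntegerv raises GL_INVALID_ENUM and writes nothing. A sketch of how to diagnose it, assuming an active context; initializing to a sentinel makes the "left unchanged" case visible:

```c
GLint streams = -1;  /* sentinel: stays -1 if the query is rejected */
glGetIntegerv(GL_MAX_VERTEX_STREAMS, &streams);
if (glGetError() == GL_INVALID_ENUM) {
    /* Enum not known to this context: GL_MAX_VERTEX_STREAMS needs
       GL 4.0 or ARB_transform_feedback3. */
    printf("GL_MAX_VERTEX_STREAMS not supported here (streams still %d)\n",
           streams);
}
```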