GL_EXT_abgr

ABGR is there so you can supply textures and other image data in the ordering that DirectX applications use.

Some versions of DirectX use ABGR ordering.

EXT_abgr is not interchangeable with the texture swizzle functionality. It will produce different results depending on the size of the components, e.g. 16-bit components.

I agree with that, but let's check the overview of the EXT_abgr extension:

EXT_abgr extends the list of host-memory color formats. Specifically, it provides a reverse-order alternative to image format RGBA. The ABGR component order matches the cpack Iris GL format on big-endian machines.

Based on that, I'm rather unsure whether implementations supporting the EXT_abgr extension actually support it for floating-point textures and the like.

The extension itself is ancient, and its language is far from what an extension going into core GL 4.2+ should look like. What I'm saying is that I wouldn't suggest including it in the next release of OpenGL (which is what this thread should be about).

The extension explicitly mentions that it targets the difference between little-endian and big-endian machines, which is also what Pop N Fresh suggests. But we already have the SWAP_BYTES semantics for that.

Also, I don't see any real-life use case where one would want to load textures with components in ABGR order, except for byte-sized components, where we already have the UNSIGNED_INT_8_8_8_8_REV format.
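
For the byte-sized case, a rough sketch of what I mean, treating each ABGR pixel as one packed 32-bit value with A in the most significant byte (the helper name and parameters are just placeholders, and headers exposing GL 1.2+ tokens are assumed):

```c
#include <GL/gl.h>   /* GL_UNSIGNED_INT_8_8_8_8_REV requires GL 1.2+ headers or glext.h */

/* Hypothetical helper: upload pixels where each texel is a packed 32-bit
 * value 0xAABBGGRR. With format GL_RGBA and type GL_UNSIGNED_INT_8_8_8_8_REV,
 * R is taken from bits 0-7 and A from bits 24-31, so this layout needs no
 * GL_ABGR_EXT. */
static void upload_packed_abgr(const GLuint *pixels, GLsizei width, GLsizei height)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);
}
```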

The EXT_abgr extension does NOT mention that it targets the difference between little- and big-endian machines.
The EXT_abgr extension does NOT do this.

The text of the EXT_abgr extension merely mentions that it has the same order as some other texture format if the machine it runs on is a big-endian machine. It does not mean or imply that the extension has anything to do with endianness.

Without further explanation:

EXT_abgr extends the list of host-memory color formats.  Specifically,
it provides a reverse-order alternative to image format RGBA.  The ABGR
component order matches the cpack Iris GL format on *big-endian* machines.

What do you mean by “Without further explanation”?

And by saying that the extension has nothing to do with endianness, I mean that it wasn’t made to do anything with, or because of, endianness. It was made so you could exchange textures with DirectX, because that’s the ordering DirectX uses.

EXT_abgr wasn’t made because of DirectX; it’s far too old an extension for that, as DirectX wasn’t really a match for OpenGL at that time.

It was made, as stated in the extension overview, to provide a component ordering that matches the cpack Iris GL format on big-endian machines.

So it was not for DX at that time, and nowadays we already have alternative ways to specify textures with ABGR component ordering.

I see what you mean now.
I thought it was about something else.

Thanks for explaining.

Bump.

Have you actually tested this or are you just speaking theoretically?

I dispute that there is an endianness issue at all.

Firstly, we’re not talking about CPUs here, we’re talking about GPUs.

Secondly, the OpenGL spec clearly defines the bit layout for this type (and others). Even if you go back to the old GL_EXT_packed_pixels spec you will see clearly defined layouts. Endianness doesn’t even come into it.

Thirdly, this is not actually an “int” or “unsigned int” data type; it’s a 32-bit data type that may be conveniently represented by an int. If you want to, you can use bitwise operations to write the data in directly (see the sketch below), or use another 32-bit data type.

Fourthly, the OpenGL specification does not contain one instance of the word “endian”. If the spec states that data is laid out in a certain manner then data is laid out in that manner and endianness is irrelevant. The implication is that GL_UNSIGNED_INT_8_8_8_8_REV is portable across different endianness.

Unless a test case can demonstrate that GL_UNSIGNED_INT_8_8_8_8_REV is problematic on an architecture with different endianness, and unless that test case is otherwise correct (i.e. it doesn’t manipulate the usage of the type in order to prove a point), I call bull.
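
For the record, here is roughly what the “bitwise operations” point above looks like in practice (pack_rgba8888_rev is just an illustrative name):

```c
#include <stdint.h>

/* Sketch: build one texel for format GL_RGBA, type GL_UNSIGNED_INT_8_8_8_8_REV.
 * The bit positions come straight from the packed-pixels layout: R in bits 0-7,
 * G in 8-15, B in 16-23, A in 24-31. The layout is defined on the 32-bit value
 * itself, which is the point being made above. */
static uint32_t pack_rgba8888_rev(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)a << 24) | ((uint32_t)b << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)r;
}
```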

This issue is actually taken care of in OpenGL 3.3 with GL_ARB_texture_swizzle, which effectively obsoletes GL_EXT_abgr (along with GL_BGR and GL_BGRA).

I hadn’t gotten up to speed with the new standards and totally missed its existence.

(Do you make a habit of bumping dead threads so you can respond to months old posts and “call bull”?)

Not correct.

GL_ARB_texture_swizzle only works with GPU data.

GL_UNSIGNED_INT_8_8_8_8_REV, GL_BGR, GL_BGRA, GL_ABGR_EXT, etc. only work with CPU data, i.e. the data you specify in glTexImage and similar functions. You can allocate an RGBA8 texture and call TexSubImage to fill one half of it with BGRA data and the other half with RGBA data. It’s like swizzling for your CPU data.
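
A rough sketch of that last point (the 256x256 size and the two source buffers are made up for illustration, and GL 1.2+ headers are assumed for GL_BGRA):

```c
#include <GL/gl.h>

/* One GPU-side RGBA8 texture, filled from two differently ordered CPU-side
 * buffers; 'format'/'type' only describe the client memory being read. */
static void fill_halves(const void *bgra_pixels, const void *rgba_pixels)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);           /* allocate storage only */
    glTexSubImage2D(GL_TEXTURE_2D, 0,   0, 0, 128, 256,
                    GL_BGRA, GL_UNSIGNED_BYTE, bgra_pixels); /* left half from BGRA bytes */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 128, 0, 128, 256,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba_pixels); /* right half from RGBA bytes */
}
```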

The ‘format’ and ‘type’ parameters of glTexImage have nothing to do with the real texture format the GPU is using.

What he’s saying is that you can use the texture swizzle mask to rearrange the color components. So if your data was actually stored in RGBA order, but you uploaded it as ARGB, you can use the swizzle mask to put the right components back into the right place.
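
To make that concrete for the ABGR ordering this thread is about (the ARGB case works the same way), a minimal sketch assuming GL 3.3+ headers, where abgr_pixels is a hypothetical buffer with bytes in A,B,G,R order:

```c
#include <GL/gl.h>

/* The bytes are A,B,G,R but are uploaded as if they were R,G,B,A, so each
 * stored channel ends up holding its "mirror" component. The swizzle mask
 * routes them back when the texture is sampled. */
static void upload_abgr_via_swizzle(const void *abgr_pixels, GLsizei w, GLsizei h)
{
    static const GLint swizzle[4] = { GL_ALPHA, GL_BLUE, GL_GREEN, GL_RED };

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, abgr_pixels);
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
}
```

This trades a client-memory format for per-texture sampling state, which is the sense in which the swizzle subsumes GL_EXT_abgr.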

So GL_EXT_abgr is in fact subsumed by multiple separate pieces of functionality already in core. Topic closed :slight_smile: