Direct GL_YUV, GL_YCbCr or YCoCg texture support?

Why doesn't OpenGL have standardised texture support for standard digital TV video formats such as YUV or YCbCr in 4:2:2 or 4:1:1???

At first sight, we have to use a color matrix for this
=> why hasn't this been improved in all this time???

All videos are made in some sort of YUV format, so we have to systematically convert from YUV to RGB before sending the data to an OpenGL texture => that's a lot of time lost for nothing…

All recent graphics cards seem to support YUV formats, so this can't be a technical problem… and a few adds/muls aren't really hard to integrate into a graphics chipset or the API : )

It's really mysterious to me why this hasn't already been exposed in the OpenGL API (these formats have existed for decades…)


It's the same thing with hypothetical GL_MPEG1, GL_MPEG2 and/or a relatively new GL_MPEG4 support for video texturing too :slight_smile:

This already works very well in real time with avicodec, which decodes MPEG1/MPEG2/MPEG4 streams that I can easily load into OpenGL textures…

So … => WHY ???


I think the Mesa extension YCBCR_422_MESA is something like what I want, and Apple made a similar (but really slow, from what I've read) extension in the past…

I'll test whether it is exposed on my various PCs, EeePC and Macintoshes, which all have different graphics cards and drivers…


OpenGL is a 2D/3D graphics rendering API, not a video processing API. Does that mean you can't do video processing with OpenGL? No. But it does mean there aren't custom APIs geared around it.

I suggest you check out APIs such as the X.Org Xv library, which already has built-in support for playback of these video formats, with GPU acceleration when available, and APIs such as NVIDIA's VDPAU, which provides GPU-assisted decode and playback of even very compute-intensive formats like H.264.

A number of vendors (e.g. Apple, …) have had support for rendering YCbCr 4:2:2 formats through the OpenGL API. For example, on Mac OS X (going all the way back to 10.2.x) you have support for GL_APPLE_ycbcr_422, which allows one to read, store and optionally process texture data in (signed or unsigned) YCbCr 4:2:2. Considering that consumer hardware has had direct (texture) support for 4:2:2 for quite some time, the performance cost of this particular extension is effectively nonexistent.

Snow Leopard gives you more control over the server side conversion from YCbCr into RGB by way of GL_APPLE_rgb_422. The latter extension allows one to generate their own (client side) conversion from YCbCr into RGB via the programmable pipeline or through another method.

If you need support for other YCbCr formats (e.g. 4:2:0, 4:4:4, …), you are much better off using the existing (L, A, RGB) texture path and the programmable pipeline to accomplish this.

YUV422, 411, 420 and all their variants are easy to implement via shaders and render targets. For example, to convert YUV422 to RGB you need:

  1. RGBA texture contains YUV422 samples
  2. FBO with RGB target texture bound
  3. draw screen aligned textured quad with appropriate fragment shader
    In the shader you need to resolve the sampling (4:2:2 in this case), collect the YUV samples, convert them to RGB (using a conversion matrix provided by the app) and write RGB pixels.

Then just use the RGB version of the texture.

In the above example I chose an RGBA texture to store the YUV422 samples. The YUV422 format looks like:

Y0 U Y1 V  Y0 U Y1 V  Y0 U Y1 V ...
R  G B  A  R  G B  A  R  G B  A ...

and it fits perfectly into the RGBA channels.

So… for a given YUV422 image at width × height, create a width/2 × height RGBA texture and fill it with the raw YUV422 data. Also create a width × height RGB target texture and bind it to the FBO.

In the shader, depending on whether the gl_FragCoord.x coordinate is even or odd, choose Y0 U V or Y1 U V and perform the YUV to RGB conversion.

Other formats and samplings are an easy job to do.

I first thought: great idea! And then I realized that you can store the YUV data in one or more textures and decode them within a shader.

Maybe a dedicated format could be more efficient on the GPU side, but I suspect the limiting factor will be image streaming.

I am pretty much against the idea of having YUV support in OpenGL.
I could understand it in the 2.x era, but not today with all the shader functionality.

The problem with YCrCb 4:2:2 & co. is that there are so many combinations of memory format (CrYCbY, YCrYCb, …).

There are several color matrices for converting to RGB (CCIR 601, CCIR 709, full range).

Then you have several ways to upscale from 4:2:2 to 4:4:4: simple replication, linear filtered upscaling, or something more sophisticated.

Then you have 8, 10 or 12 bits per channel.

Hardwiring all this into the driver goes pretty much against the idea of having few fixed-function features and more programmable ones.

I'd sooner see a utility library that handles all this.

If hardware supports it directly, there is no reason for it not to be included in the API. It’s in Apple’s implementation and Mesa, so it probably should be standardized. It’s worth noting that Xv uses 3D hardware these days anyway.

For some companies the video decoder is a separate component from the GPU. I don't know about ATI or NVIDIA, but even if it's on the same chip, that doesn't mean the video decoder is really integrated with the rest of the chip. It lives on its own.

The responses about YUV shaders remind me of a TV show we call "Les chiffres et les lettres" :slight_smile:

Let me explain…

Say we have only the numbers 1, 7, 9, 5, 10, 8, 3 and 2 to use,
and we have to reach, for example, the number 92:

7-1 = 6
10 * 8 = 86
6 + 86 = 92 => yes, good, we have found the formula that gives the right final value :slight_smile:

In OpenGL it seems to be about the same game, only more comic, because we have to handle relatively complex pixel shaders combining formulas that are far more difficult, just to do the YUV->RGB conversion.

But something like

 glTexImage2D( target, level, GL_RGB, width, height, border, GL_YUYV, GL_UNSIGNED_BYTE, texels ) 

seems to me very much simpler to handle than a pixel shader, and it doesn't suffer from the many incompatibilities between different cards/vendors/shader versions that exist today (when, and only when, the card/driver supports pixel shaders, of course…)


Note that I deliberately made an error in the "chiffres et les lettres" solution.

That's because I have found a lot of bad YUV to RGB conversion formulas on the net… :slight_smile:

=> the simplest answer can also be the best one…


Maybe you did not understand my point. There are about 50 (or more?) combinations/formats/standards for YUV packing.
Should OpenGL implement all of them?

But ask yourself: is there anything that stops you from writing an application that displays images in YUV format? Is it slow?
The answers are: no, it can be done, and it is just as fast.

Don't be afraid to write a simple shader. In the end you'll find yourself adding a few more things to it, like gamma correction, deinterlacing, … Then the 5 lines of code for YUV->RGB are nothing.

For an idea of what is already standardised in OpenGL:

Table 1: OpenGL Internal Texture Formats. Each internal texture format has a corresponding base internal format and its desired component resolutions.

Sized            Base             R     G     B     A     L     I
Internal Format  Internal Format  bits  bits  bits  bits  bits  bits
R3_G3_B2         RGB              3     3     2
RGB4             RGB              4     4     4
RGB5             RGB              5     5     5
RGB8             RGB              8     8     8
RGB10            RGB              10    10    10
RGB12            RGB              12    12    12
RGB16            RGB              16    16    16
RGBA2            RGBA             2     2     2     2
RGBA4            RGBA             4     4     4     4
RGB5_A1          RGBA             5     5     5     1
RGBA8            RGBA             8     8     8     8
RGB10_A2         RGBA             10    10    10    2
RGBA12           RGBA             12    12    12    12
RGBA16           RGBA             16    16    16    16

So, why not one little new GL_YUYV24 or something like it at the end of this table???

Er… have you ever actually written a single shader that can handle all the formats listed here??? :slight_smile:


Note that the LUMINANCE internal format is already standardised :slight_smile:

=> so we ONLY have to add the UV on top of that (which can easily be computed with a very simple yuv2rgb lookup-table association…)


Cheap arguments.

Everybody knows OpenGL has a lot of legacy stuff.
All the intensity and luminance formats are a dead end because they cannot be used as render targets. Most of the formats you listed are deprecated; some were replaced by GL_ARB_texture_rg.

Regarding your GL_YUYV24:
Look here:
There are dozens of YUV formats, and each of them can be interpreted in a dozen different ways (color matrix, upscale filtering, …).

OpenGL is not a format-conversion library. It is a tool to render stuff, and as long as it can render YUV video it does its job.

Please think about it. Write your shader. You will be happy to have full control of your pipeline.

Mfort, please try to understand that I'm not talking about a multitude of internal formats, but only one…
=> how many different RGB(A) formats does OpenGL handle, please?

In the meantime, I'll look into what GL_ARB_texture_rg is…


I know you want only one… right now. Tomorrow, another one… In 6 months you will ask for 50 combinations.
OpenGL should be generic enough, but still related to 3D rendering.

OpenGL has several extensions regarding YUV formats:
none of them has gained the ARB "stamp" or reached core. That indicates something.

(my last post in this thread)

And the shufps SSE instruction already exists, for example :slight_smile:


Thanks, GL_EXT_422_pixels seems to be a good start…