YUV overlay conversion for video

I’m trying to implement video playback within my OpenGL app. I have some code that does most of what I want, but it doesn’t use OpenGL: it creates each frame as a YUV overlay, which is then displayed. I need a way to display that YUV overlay as a texture within OpenGL. To the best of my knowledge, OpenGL textures don’t know about YUV (or YCbCr), so I have to convert each frame to an RGB texture. This can be done in software, but since modern video cards (I have a GeForce4 Ti 4400) can do it much faster, I’d really like to harness the power of the GPU.

I saw an SGI newsgroup post about drawing into a pbuffer with YUV source packing and an RGBA destination. Is this the way to go? Is there a better way? Can someone point me to some sample code?


Is it really a problem converting to RGB? You’ll have to load/decompress the data somewhere along the line anyway, so just put your conversion in there. I can’t imagine it slowing things down much.
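To illustrate, here’s a rough sketch of the software route: a per-pixel BT.601 YUV→RGB conversion (fixed-point, assuming full-range 0–255 luma; a “studio swing” 16–235 source would need extra scaling) plus a loop that unpacks a planar 4:2:0 frame into a packed RGB buffer you could hand to glTexImage2D. The function names are mine, not from any library:

```c
#include <stdio.h>

/* Clamp an int into the 0..255 range of an 8-bit channel. */
static unsigned char clamp8(int v)
{
    return (unsigned char)(v < 0 ? 0 : v > 255 ? 255 : v);
}

/* Convert one YUV sample to RGB using BT.601 coefficients
   scaled by 256 so the inner loop stays in integer math. */
static void yuv_to_rgb(int y, int u, int v,
                       unsigned char *r, unsigned char *g, unsigned char *b)
{
    int d = u - 128, e = v - 128;
    *r = clamp8(y + ((359 * e) >> 8));            /* 1.402 * V'  */
    *g = clamp8(y - ((88 * d + 183 * e) >> 8));   /* 0.344, 0.714 */
    *b = clamp8(y + ((454 * d) >> 8));            /* 1.772 * U'  */
}

/* Convert a planar 4:2:0 frame (separate Y, U, V planes) into a
   tightly packed RGB buffer suitable for glTexImage2D(..., GL_RGB, ...).
   Chroma is subsampled 2x2, so each U/V sample covers four pixels. */
static void yuv420_frame_to_rgb(const unsigned char *yp,
                                const unsigned char *up,
                                const unsigned char *vp,
                                int w, int h, unsigned char *rgb)
{
    for (int j = 0; j < h; j++) {
        for (int i = 0; i < w; i++) {
            int ci = (j / 2) * (w / 2) + (i / 2);
            yuv_to_rgb(yp[j * w + i], up[ci], vp[ci],
                       &rgb[(j * w + i) * 3 + 0],
                       &rgb[(j * w + i) * 3 + 1],
                       &rgb[(j * w + i) * 3 + 2]);
        }
    }
}
```

If you fold this into the same pass that already copies the decoded frame, you only touch the pixels once, which is usually cheap next to the decode itself.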