I’m trying to implement video playback within my OpenGL app. I have some code that does most of what I want, but it doesn’t use OpenGL: it creates each frame as a YUV overlay, which is then displayed. I need a way to display that YUV overlay as a texture within OpenGL. To the best of my knowledge, OpenGL textures do not know about YUV (or YCbCr), so I must convert each frame to an RGB texture. This can be done in software, but since modern video cards (I have a GeForce4 Ti 4400) can do this much faster, I’d really like to be able to harness the power of the GPU.
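For reference, here is a minimal sketch of the software fallback I mean — a per-pixel YCbCr-to-RGB conversion using the common BT.601 video-range coefficients (an assumption on my part; the coefficients differ for other color spaces). The resulting RGB buffer could then be uploaded with glTexImage2D:

```c
#include <stdint.h>

/* Clamp an intermediate value into the 0..255 byte range. */
static uint8_t clamp255(int v)
{
    return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

/* Convert one BT.601 "video range" YCbCr pixel to RGB.
 * Uses the standard 8.8 fixed-point coefficients so it stays
 * integer-only; the +128 term rounds before the >> 8 shift. */
void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                  uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = (int)y  - 16;   /* luma,   nominal range 16..235 */
    int d = (int)cb - 128;  /* blue-difference chroma        */
    int e = (int)cr - 128;  /* red-difference chroma         */

    *r = clamp255((298 * c + 409 * e + 128) >> 8);
    *g = clamp255((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp255((298 * c + 516 * d + 128) >> 8);
}
```

Doing this on the CPU for every pixel of every frame is exactly the cost I’m hoping the GPU can absorb.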
I saw an SGI newsgroup post about drawing into a pbuffer with YUV source packing and an RGBA destination. Is this the way to go? Is there a better way? Can someone point me to some sample code?