How to create a Texture2D using nv12 byte data?

Win10 & GTX 2060 & OSG 3.6.4

memcpy(_videoPlayData._videoImage->_pArray, pPicture->_data[0], sizeof(unsigned char)*pPicture->_ulPicWidth * pPicture->_ulPicHeight * 3 /2);

_videoImage->_pImage->setImage(_videoImage->_pVideoImageData->_picWidth, _videoImage->_pVideoImageData->_picHeight, 0, GL_LUMINANCE, GL_LUMINANCE, GL_UNSIGNED_BYTE, _videoImage->_pVideoImageData->_pArray, osg::Image::NO_DELETE);

You’ve provided almost no info here. What do you want to do with the data? How do you want to sample it? Why do you want GL to treat it in this format, which isn’t natively supported as a texture format?


See the forum posting guidelines for tips on composing a post that might gather more useful responses. Also check out:

- If this is purely an OpenSceneGraph usage question, see their support forum and source code for details.
- If it's an OpenGL question, you could consider creating a WxH LUMINANCE8 (or R8) texture for the Y plane and a (W/2)x(H/2) LUMINANCE8_ALPHA8 (or RG8) texture for the interleaved UV plane, then combining the two in a fragment shader to reconstruct RGB.