Hi, and thank you for reading these few questions.
First, some context: my current OpenGL application displays an interactive scene and sends asynchronous copies of the framebuffer (via glReadPixels) to a remote client, which displays them as a 2D texture, simulating a video stream.
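For reference, the asynchronous glReadPixels copy I mentioned is done with a pair of pixel buffer objects (PBOs), so the read of frame N overlaps with the transfer of frame N-1. Here is a minimal sketch of that pattern; `send_to_client` is a placeholder for my networking code, not a real API, and I assume GLEW (or an equivalent loader) for the buffer-object entry points:

```c
/* Minimal double-PBO asynchronous readback sketch.
 * Assumes a valid GL context and GLEW (or similar) already initialized.
 * send_to_client() is a hypothetical placeholder for the network code. */
#include <GL/glew.h>
#include <stddef.h>

static GLuint pbo[2];
static int    pbo_index = 0;

void init_readback(int width, int height)
{
    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER,
                     (GLsizeiptr)width * height * 4, NULL,
                     GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

void capture_frame(int width, int height)
{
    int next = (pbo_index + 1) % 2;

    /* Start an asynchronous read of the current frame into one PBO;
     * with a PBO bound, glReadPixels returns without stalling the CPU. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[pbo_index]);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);

    /* Meanwhile, map the other PBO, which holds the frame read on the
     * previous call, and hand it to the network layer. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[next]);
    void *pixels = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
    if (pixels) {
        send_to_client(pixels, (size_t)width * height * 4); /* placeholder */
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);

    pbo_index = next;
}
```

This introduces one frame of latency, but keeps the GPU-to-CPU transfer off the critical path.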
I also know it is possible to make a video from a scene, either building it frame by frame or capturing it live, even though I have never done it myself.
But what I am trying to do (and do not know how to) is feed a live video stream of my scene to one or more remote clients.
I think (though I am not sure) that usual video streaming does not provide truly "real-time" video, but rather a slightly delayed, buffered one.
So what I am really wondering is:
- Is it possible to feed a live (i.e. real-time) video stream directly from my OpenGL scene (instead of displaying it on my screen first and then copying/sending it)? And if so, how?
- Does such a video have to be progressive (full frames), or could it be interlaced?
- Is it possible to use on-card (GPU) compression to reduce the size of the transmitted data? (Even if my current card does not provide such a feature, I would consider buying a new one for this.)
- Do you think it would be more efficient to feed a network video stream than to send screen copies (glReadPixels) over the same network to a remote host? This question does not invalidate the previous ones: even if it is not more efficient, I will still need to learn how to produce a video stream from OpenGL.
Thank you for your time and your future answers.