I am creating a custom camera preview using GLSurfaceView, using OpenGL to render the frames given to me by the camera. I have the camera fully implemented and working as I would expect, with no fps loss, correct aspect ratios, etc. The issue came when I needed to capture frames from the camera feed; my first thought was to use glReadPixels().
Using GLES20.glReadPixels() I find that some devices experience fps loss, mainly the devices with higher screen resolutions. This makes sense, because glReadPixels has more pixels to read at a higher resolution.
I did some digging and found others had similar issues with glReadPixels, and many suggested using a PBO, or rather two of them acting as a double buffer, which allows pixel data to be read without blocking/stalling the current rendering process. I understand the concept of double buffering, but I'm fairly new to OpenGL and need some guidance on how to get a double-buffered PBO working.
I have found a few partial solutions for PBO double buffering, but never a complete one that shows how it interacts with GLES.
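To show where I've got to, here is a minimal sketch of how I think the two PBOs would be allocated (this is my own attempt, not working code: it assumes a GLES 3.0 EGL context, since GL_PIXEL_PACK_BUFFER only exists from GLES30, and all the names like mPboIds are mine):

```java
import android.opengl.GLES30;

// Sketch: allocate two pixel-pack buffers sized for the surface.
// Call once the surface size is known, e.g. from onSurfaceChanged().
public class PboPair {
    final int[] mPboIds = new int[2];

    public void init(int width, int height) {
        int size = width * height * 4; // RGBA, one byte per channel
        GLES30.glGenBuffers(2, mPboIds, 0);
        for (int id : mPboIds) {
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, id);
            // GL_STREAM_READ: written by GL, read back by the app each frame
            GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, size, null,
                    GLES30.GL_STREAM_READ);
        }
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
    }
}
```

What I'm unsure about is the per-frame part: which PBO to read into, which to map, and when.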
My implementation of GLSurfaceView.Renderer.onDrawFrame():
// mBuffer and mBitmap are declared and allocated outside of the onDrawFrame Method
// Buffer is used to store pixel data from glReadPixels
mBuffer.rewind();
GLES20.glUseProgram(hProgram);
if (tex_matrix != null)
{
GLES20.glUniformMatrix4fv(muTexMatrixLoc, 1, false, tex_matrix, 0);
}
GLES20.glUniformMatrix4fv(muMVPMatrixLoc, 1, false, mMvpMatrix, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex_id);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, GLConstants.VERTEX_NUM);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
// Read pixels from the current GLES context (blocks until the GPU is done)
GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mBuffer);
// Copy the Pixels from the buffer
mBitmap.copyPixelsFromBuffer(mBuffer);
GLES20.glUseProgram(0);
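For context, this is the direction I've been trying to take the readback in, based on the double-buffering suggestions I found. It is a sketch only, assuming a GLES 3.0 context, two PBOs already allocated with glBufferData(GL_PIXEL_PACK_BUFFER, ...), and names of my own invention; I don't know if the alternation logic is right:

```java
import android.graphics.Bitmap;
import android.opengl.GLES30;
import java.nio.ByteBuffer;

// Sketch: alternate between two PBOs so each frame starts an async read
// into one buffer while mapping the other buffer's result from last frame.
public class PboReader {
    private final int[] mPboIds;   // two pack buffers, already allocated
    private final int mWidth, mHeight;
    private final Bitmap mBitmap;  // receives the previous frame's pixels
    private int mIndex = 0;
    private boolean mFirstFrame = true;

    public PboReader(int[] pboIds, int width, int height, Bitmap bitmap) {
        mPboIds = pboIds;
        mWidth = width;
        mHeight = height;
        mBitmap = bitmap;
    }

    /** Call at the end of onDrawFrame(), after the draw calls. */
    public void readPixels() {
        int readIndex = mIndex;            // PBO that receives this frame
        int mapIndex = (mIndex + 1) % 2;   // PBO holding the previous frame
        mIndex = mapIndex;

        // With a buffer bound to GL_PIXEL_PACK_BUFFER, the last argument to
        // glReadPixels is a byte offset into that buffer, and the call
        // returns without waiting for the GPU to finish.
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, mPboIds[readIndex]);
        GLES30.glReadPixels(0, 0, mWidth, mHeight,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

        if (!mFirstFrame) {
            // Map the other PBO; its read was issued a frame ago, so the
            // data should be ready and the map should not stall.
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, mPboIds[mapIndex]);
            ByteBuffer pixels = (ByteBuffer) GLES30.glMapBufferRange(
                    GLES30.GL_PIXEL_PACK_BUFFER, 0, mWidth * mHeight * 4,
                    GLES30.GL_MAP_READ_BIT);
            if (pixels != null) {
                // Copy out before unmapping; the mapping is invalid afterwards
                mBitmap.copyPixelsFromBuffer(pixels);
            }
            GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
        }
        mFirstFrame = false;
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
    }
}
```

Is this roughly the right shape, and how should it be wired into onDrawFrame() above? Any guidance on whether the ping-pong between the two buffers is correct would be appreciated.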