We are developing a function to capture the screen using glReadPixels(). We are currently dealing with a bug where the screenshot function captures a red screen. The issue is only reproducible on some Android devices: so far it occurs on the HUAWEI BTW-DL09 (Android 7) and the Galaxy Note 10 Plus (SM-N975F, Android 10). Below is the code block and the debug logs from when the error is produced.
Could an issue with the camera resolution be causing this problem? We would love any input if anyone has experienced something similar or has a solution!
For instance, if the read framebuffer is the default framebuffer, that framebuffer is an EGL double-buffered window surface, and the active read buffer (glReadBuffer()) is GL_BACK, are you calling glReadPixels() after rendering but before you call eglSwapBuffers()?
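If that's your setup, the required ordering is: render, read, then swap. A minimal sketch of that ordering follows; the GL/EGL calls are shown as comments since they need a live context, and the GL_RGBA/GL_UNSIGNED_BYTE sizing helper is only an assumption about your readback format:

```c
#include <stddef.h>

/* Bytes needed for a GL_RGBA / GL_UNSIGNED_BYTE readback of a w x h
 * region (4-byte texels never need row padding, regardless of
 * GL_PACK_ALIGNMENT). */
size_t rgba_readback_bytes(size_t w, size_t h) {
    return w * h * 4;
}

/* Ordering sketch for capturing the back buffer of a double-buffered
 * EGL window surface:
 *
 *   drawFrame();                                // all rendering done
 *   glReadPixels(0, 0, w, h, GL_RGBA,
 *                GL_UNSIGNED_BYTE, pixels);     // read BEFORE the swap
 *   eglSwapBuffers(display, surface);           // swap AFTER the read
 *
 * Reading after eglSwapBuffers() gives undefined back-buffer contents
 * unless EGL_SWAP_BEHAVIOR is EGL_BUFFER_PRESERVED (see below).
 */
```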
And for another instance, if you’re trying to read from a window surface’s back buffer after calling eglSwapBuffers(), check what EGL_SWAP_BEHAVIOR is set to. Is it EGL_BUFFER_PRESERVED or EGL_BUFFER_DESTROYED? Note that this behavior may or may not be controllable.
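You can query (and sometimes set) that per surface. A sketch, assuming EGL 1.4: the constant values are copied from the standard EGL headers, and the query/set calls are left as comments since they need a live display:

```c
#include <stdbool.h>

/* EGL constants; values copied from the EGL 1.4 headers. */
#define EGL_SWAP_BEHAVIOR     0x3093
#define EGL_BUFFER_PRESERVED  0x3094
#define EGL_BUFFER_DESTROYED  0x3095

/* After eglSwapBuffers(), the back buffer is only guaranteed to still
 * hold the frame you just rendered if the surface's swap behavior is
 * EGL_BUFFER_PRESERVED. */
bool back_buffer_valid_after_swap(int swap_behavior) {
    return swap_behavior == EGL_BUFFER_PRESERVED;
}

/* On a live display you would query the value with:
 *   EGLint behavior;
 *   eglQuerySurface(display, surface, EGL_SWAP_BEHAVIOR, &behavior);
 * and, where the implementation allows it, request preservation with:
 *   eglSurfaceAttrib(display, surface, EGL_SWAP_BEHAVIOR,
 *                    EGL_BUFFER_PRESERVED);
 */
```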
Beyond the misbehavior issue you’ve asked about, I hope I don’t need to tell you how inefficient doing a blocking glReadPixels() call is, especially on mobile GPUs. There are better ways to do this if you don’t want the massive GPU pipeline bubble this is going to cause.
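For reference, the usual non-blocking alternative on OpenGL ES 3.0+ is an asynchronous readback through a ring of pixel-pack buffers (PBOs). This is only a sketch under that assumption; the GL calls are comments since they need a live context, and only the ring arithmetic is directly runnable here:

```c
/* Asynchronous readback sketch for OpenGL ES 3.0+ using a ring of
 * pixel-pack buffers, so the CPU never stalls waiting for the GPU:
 *
 *   frame N:
 *     glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
 *     glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, 0); // async,
 *                                       // writes into the bound PBO
 *   frame N+1 (or later), for the PBO filled previously:
 *     glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
 *     void *p = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, w * h * 4,
 *                                GL_MAP_READ_BIT);  // data is ready
 *     // ... copy out of p ...
 *     glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
 */

/* Advance to the next buffer in the PBO ring. */
int next_pbo_index(int i, int ring_size) {
    return (i + 1) % ring_size;
}
```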
2. What is the active GL_READ_BUFFER within that framebuffer?
We checked GL_FRONT and GL_BACK.
3. What is the format of that buffer within that framebuffer? (the EGL config if it’s an EGL surface)
The default value.
4. Are you sure that reading from that buffer is allowed?
I believe so, since we are able to read from the buffer on different devices.
5. Are you sure that the buffer’s contents should be valid at this point?
I don’t know, but we are able to read from the buffer on different devices so I assume so.
6. Are you checking for GL errors before and after?
Yes.
We are unsure what causes the misbehavior, since it behaves properly on other Android devices. If you have any ideas about what might be causing it, we’d love to hear back!
Ok, so the default framebuffer. Meaning the EGL surface that you’ve targeted rendering to. Which may be a window, a pbuffer, a pixmap, or possibly another surface type. Which is it?
Which one of those two is active when you see the behavior you describe?
There isn’t a default. What type of EGL surface is your default framebuffer? Window, pbuffer, or pixmap? What is the format of that EGL surface? What is the format of the selected color buffer within that EGL surface? Is it a double-buffered surface? What does the EGL config that you’re using look like?
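All of those format questions can be answered at runtime with EGL queries. A sketch, assuming EGL 1.4: the query calls are comments since they need a live display, and the bit-depth classifier is just an illustration of interpreting the answers:

```c
#include <string.h>

/* Classify a color-buffer layout from its per-channel bit depths, as
 * returned by eglGetConfigAttrib() for EGL_RED_SIZE, EGL_GREEN_SIZE,
 * EGL_BLUE_SIZE, and EGL_ALPHA_SIZE. */
const char *classify_color_buffer(int r, int g, int b, int a) {
    if (r == 8 && g == 8 && b == 8 && a == 8) return "RGBA8888";
    if (r == 8 && g == 8 && b == 8 && a == 0) return "RGB888";
    if (r == 5 && g == 6 && b == 5 && a == 0) return "RGB565";
    return "other";
}

/* On a live display:
 *   EGLint cfg_id, r, g, b, a;
 *   eglQuerySurface(dpy, surf, EGL_CONFIG_ID, &cfg_id); // which config
 *   eglGetConfigAttrib(dpy, config, EGL_RED_SIZE,   &r);
 *   eglGetConfigAttrib(dpy, config, EGL_GREEN_SIZE, &g);
 *   eglGetConfigAttrib(dpy, config, EGL_BLUE_SIZE,  &b);
 *   eglGetConfigAttrib(dpy, config, EGL_ALPHA_SIZE, &a);
 * The surface type (window / pbuffer / pixmap) is whatever your code
 * created: eglCreateWindowSurface(), eglCreatePbufferSurface(), or
 * eglCreatePixmapSurface().
 */
```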
That’s not how it works. Different GPUs, different drivers, different OS versions and software layers = different capabilities and different behavior. Though through EGL you can query and control that behavior.
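One concrete example of that querying: OpenGL ES only guarantees that glReadPixels() accepts GL_RGBA + GL_UNSIGNED_BYTE, plus one implementation-defined format/type pair that differs per device and is queryable. A sketch, with the constant values copied from the GLES2 headers and the queries left as comments since they need a live context:

```c
/* Constants copied from the OpenGL ES 2.0 headers. */
#define GL_RGBA                             0x1908
#define GL_RGB                              0x1907
#define GL_UNSIGNED_BYTE                    0x1401
#define GL_IMPLEMENTATION_COLOR_READ_FORMAT 0x8B9B
#define GL_IMPLEMENTATION_COLOR_READ_TYPE   0x8B9A

/* GLES guarantees exactly two readable format/type pairs: the fixed
 * GL_RGBA + GL_UNSIGNED_BYTE, and one implementation-defined pair.
 * Anything else may fail or misbehave depending on the device. */
int readback_pair_is_portable(int format, int type) {
    return format == GL_RGBA && type == GL_UNSIGNED_BYTE;
}

/* On a live context, the implementation-defined pair is:
 *   GLint fmt, type;
 *   glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &fmt);
 *   glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE,  &type);
 * and the GPU/driver identity comes from:
 *   glGetString(GL_VENDOR);
 *   glGetString(GL_RENDERER);
 *   glGetString(GL_VERSION);
 */
```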
Same response here as on the last. Specific capabilities and behavior on device A with driver B doesn’t guarantee the same capabilities and behavior on device C with driver D.
Where are you calling glReadPixels() w.r.t. your rendering for a frame and the subsequent eglSwapBuffers() for that frame?
Thank you for your reply. We had some additional follow-up questions regarding how to read the buffer format based on the questions you asked us.
Ok, so the default framebuffer. Meaning the EGL surface that you’ve targeted rendering to. Which may be a window, a pbuffer, a pixmap, or possibly another surface type. Which is it?
How do we figure out which EGL surface we are targeting for rendering?
Which one of those two is active when you see the behavior you describe?
GL_FRONT and GL_BACK both reproduce the behavior that we described.
There isn’t a default. What type of EGL surface is your default framebuffer? Window, pbuffer, or pixmap? What is the format of that EGL surface? What is the format of the selected color buffer within that EGL surface? Is it a double-buffered surface? What does the EGL config that you’re using look like?
How do we figure out which EGL surface is the default framebuffer? Is there a method to find the format value?
That’s not how it works. Different GPUs, different drivers, different OS versions and software layers = different capabilities and different behavior. Though through EGL you can query and control that behavior.
This is a follow-up to the previous question. How do we find out the GPU, driver, OS version, etc.? Are there resources on using EGL queries to control behavior across different GPUs/drivers? We believe this may be the underlying issue.
Where are you calling glReadPixels() w.r.t. your rendering for a frame and the subsequent eglSwapBuffers() for that frame?
We don’t know. The screenshot is taken at runtime, and that’s where glReadPixels() is called, but we are not sure when eglSwapBuffers() is called. If there is a way to figure this out, we would love to know.
If you’re making an app to capture a screen that has been rendered by other apps, you’re probably going to need to use a different API. glReadPixels can only read from framebuffers which are accessible to the application, which doesn’t necessarily include the system-wide screen. An OS may give each application its own framebuffer then composite those to generate the final display.
If you’re making an app to capture a screen that has been rendered by other apps, you’re probably going to need to use a different API
We’re making a screen-capture function for content rendered by our own application. If we continue to use OpenGL, is there a reason why it would still render a red screen?
Either you’re trying to capture stuff that wasn’t written by your application, or you’re reading the wrong framebuffer, or you’re reading it at the wrong time. If you’re rendering to a FBO, you can read it at any point. If you’re reading from the back buffer of a system framebuffer, you need to capture it after rendering but before any “swap buffer” call (e.g. eglSwapBuffers). If you’re reading from a front buffer, that can be invalidated by the system at any time.
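One more gotcha, whichever buffer you end up reading: glReadPixels() returns rows bottom-up (GL’s window origin is the lower-left corner), so a screenshot usually needs a vertical flip before encoding. A minimal sketch, assuming tightly packed RGBA8888 output:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Flip a tightly packed RGBA8888 image in place, turning GL's
 * bottom-up row order into the top-down order image formats expect. */
void flip_rows_rgba(uint8_t *pixels, int width, int height) {
    size_t stride = (size_t)width * 4;        /* bytes per row */
    uint8_t *tmp = malloc(stride);
    if (!tmp) return;
    for (int y = 0; y < height / 2; ++y) {
        uint8_t *top = pixels + (size_t)y * stride;
        uint8_t *bot = pixels + (size_t)(height - 1 - y) * stride;
        memcpy(tmp, top, stride);
        memcpy(top, bot, stride);
        memcpy(bot, tmp, stride);
    }
    free(tmp);
}
```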