sRGB default framebuffers: an unreliable feature?

When I create an sRGB default framebuffer on Android (using EGL_KHR_gl_colorspace + EXT_sRGB_write_control), the contents are not converted from linear to sRGB as expected (GL ES 3.1, Mali-T720):

const EGLint window_attribs[] = { EGL_GL_COLORSPACE_KHR, EGL_GL_COLORSPACE_SRGB_KHR, EGL_NONE };
egl_surface = eglCreateWindowSurface(egl_display, config, egl_window, window_attribs);
if (egl_surface == EGL_NO_SURFACE) {
    printf("[!] eglCreateWindowSurface failed: %i\n", eglGetError());
    return false;
}
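One thing I also do is verify that the display actually exports EGL_KHR_gl_colorspace before passing the attribute (a driver that doesn't know the attribute should fail surface creation, but behaviour varies). A small sketch; the token matcher avoids the classic substring pitfall where a plain strstr() would also match extensions that merely share the prefix (e.g. EGL_KHR_gl_colorspace_scrgb):

```c
#include <string.h>

/* Return 1 if `name` appears as a whole space-separated token in the
 * extension string `exts`; plain strstr() can false-positive on
 * extensions that share a prefix. */
static int has_egl_extension(const char *exts, const char *name)
{
    size_t len = strlen(name);
    const char *p = exts;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == exts) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends) return 1;
        p += len;
    }
    return 0;
}

/* Usage (assuming egl_display from the snippet above):
 *   const char *exts = eglQueryString(egl_display, EGL_EXTENSIONS);
 *   if (!exts || !has_egl_extension(exts, "EGL_KHR_gl_colorspace"))
 *       // fall back to creating a non-sRGB surface
 */
```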

I verify the encoding of the default framebuffer like this:

glEnable(GL_FRAMEBUFFER_SRGB_EXT);
GLint encoding = -1;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK, GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING, &encoding);
if (encoding == GL_LINEAR) printf("Framebuffer: GL_LINEAR\n");
if (encoding == GL_SRGB  ) printf("Framebuffer: GL_SRGB\n");
glDisable(GL_FRAMEBUFFER_SRGB_EXT);

On Android it prints GL_LINEAR when it should print GL_SRGB, and the framebuffer contents are not converted from linear to sRGB.

On desktop (GLX, NVIDIA GTX 1060) the same codebase works: the framebuffer contents are correctly converted from linear to sRGB, yet glGetFramebufferAttachmentParameteriv still reports GL_LINEAR instead of GL_SRGB. This seems to be a known but unfixed issue: https://forums.developer.nvidia.com/t/gl-framebuffer-srgb-functions-incorrectly/34889/10

sRGB-encoded textures (including sRGB ASTC textures) are handled correctly on both of these GPUs (even on desktop; the NVIDIA driver apparently accepts sRGB ASTC texture data silently, even though ASTC is otherwise unsupported in hardware). All my code runs GL-error-free.

My question: should I avoid sRGB default framebuffer functionality altogether? It seems unreliable on multiple platforms (ARM/Mali and NVIDIA/GLX), and beyond the EGL_KHR_gl_colorspace spec there is virtually no information (such as sample code) to be found. I suspect most GL (ES) applications render to an sRGB FBO and then copy the result to the default framebuffer, so these issues may simply have gone unreported.

Has anyone successfully gotten this feature working (on any platform)? Is there anything I should double-check or verify?

I will probably render to an sRGB FBO as a workaround, but I was hoping to save some memory by using only the default framebuffer.
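For reference, the FBO workaround I have in mind looks roughly like this (an untested sketch; the 1280x720 size, handle names, and function structure are my own, not from any official sample):

```c
#include <GLES3/gl3.h>

GLuint fbo = 0, color_tex = 0;

/* Create an sRGB render target. GL_SRGB8_ALPHA8 is the only
 * colour-renderable sRGB format in ES 3.x, so this costs
 * 4 bytes/pixel (~3.5 MiB at 720p) on top of the default framebuffer. */
void create_srgb_target(void)
{
    glGenTextures(1, &color_tex);
    glBindTexture(GL_TEXTURE_2D, color_tex);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_SRGB8_ALPHA8, 1280, 720);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, color_tex, 0);
    /* Rendering into this FBO gets the linear->sRGB encode on write for
     * free. To present, either glBlitFramebuffer() to the default
     * framebuffer (beware: whether ES 3.0 blits perform colorspace
     * conversion differs between drivers) or draw a fullscreen quad
     * sampling color_tex. */
}
```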

Also, this blog post is a fairly good resource when it comes to sRGB & OpenGL:

And while on topic, two more related questions:

  1. OpenGL ES 3 does not seem to support GL_SRGB8 as an FBO format, but does support GL_RGB8. Why?
    It seems the only sRGB option on ES 3 is GL_SRGB8_ALPHA8, correct?
    See table 1: https://www.khronos.org/registry/OpenGL-Refpages/es3.0/html/glRenderbufferStorage.xhtml

  2. glGenerateMipmap fails for sRGB textures on GL ES 3 on Mali, and this is not documented. Why? Mipmap generation for sRGB textures works fine on the NVIDIA GLX desktop.

“Because that’s what the spec says”.

SRGB8_ALPHA8 is colour-renderable, SRGB8 isn’t. You can have SRGB8 textures but you can’t render to them and you can’t have SRGB8 renderbuffers.

That’s the only colour-renderable sRGB format. FWIW, RGB8 is the only colour-renderable format where the number of bits isn’t a power of two. All of the other colour-renderable RGB formats are packed into 16 or 32 bits. I’m guessing that RGB8 is so widespread that they made an exception for it.

The spec (§3.8.10) only says that (for GenerateMipmap) the texture must be colour-renderable and texture-filterable. So it’s supposed to work for SRGB8_ALPHA8, but not SRGB8.

The (desktop) OpenGL 4.6 spec doesn’t appear to have that restriction, although SRGB8 isn’t colour-renderable there either.

Ah, enlightening. I wasn’t yet familiar with the significance (or exact meaning) of the phrase “color-renderable”. Thanks!

As to the SRGB8 FBO format: the ARM Best Practices Guide repeats in numerous places that more compact formats are always better (on Mali at least) and that alignment concerns are best ignored, so I was hoping I had overlooked a possible extension, but the ES 3 docs are indeed pretty clear.

Unfortunately, rendering to an FBO and converting to sRGB manually on my current target hardware (Android/Mali-T720) incurs a significant performance impact (~50-60 fps drops to below ~30 fps, of which roughly 10 fps is lost just to the pow(c, 1.0/2.2) encode on a 1280x720 FBO).

It would be nice if someone could tell me if implicit linear to sRGB conversion using EGL_KHR_gl_colorspace actually is possible, and how it is supposed to be done.

In case you haven't already checked this, you might have a full pipeline flush going on here. Are you multi-buffering your FBOs and render targets?

Mali drivers may be different, but on PowerVR drivers at least, framebuffers/FBOs are (or were a few years back) the logical container for all in-flight rendering work. Re-using an FBO before all of the work previously associated with that FBO has completed on the GPU (which occurs a frame or two after the CPU finishes queuing the work) will trigger a pipeline flush (i.e. big stall), which can easily halve your frame rate. This is a big deal on mobile/tiled GPUs where CPU/GPU parallelism is required for acceptable performance.

Yes, I’m considering that; the Mali Best Practices Guide explicitly recommends multi-buffering.
But I don’t know how much GPU memory would be left for application content…
I may very well have a pipeline stall in there, but shader complexity also shows up measurably on this relatively dated hardware, and I may not be hitting the stall every frame depending on content complexity.

At any rate, sRGB conversion is repeatedly claimed (in official docs) to have no performance cost on Mali hardware, hence my preference.

I found that EGL_KHR_gl_colorspace was mentioned to be broken in Android 8.1 by someone at Unity (https://forum.unity.com/threads/is-linear-lighting-more-cpu-intensive-than-gamma-lighting-on-mobile.651073/); I’d like to assume it has been fixed since. It would be odd if it remained broken (in Android 9 on my test hardware), since linear-to-sRGB conversion is the one specific purpose of this tiny extension.