OpenGL ES on Android requires 2.0 constants for a 3.0 context?

I’m rendering video using

glTexSubImage2D(GL_TEXTURE_2D,
                            0,
                            0,
                            0,
                            decodedFrame.width[j],
                            decodedFrame.height[j],
                            GL_LUMINANCE,
                            GL_UNSIGNED_BYTE,
                            null);

I was using GL_RED in place of GL_LUMINANCE because I thought that, since I'm using #version 320 es in my shader, I needed to use OpenGL ES 3.x constants like GL_RED. But GL_RED won't work; only GL_LUMINANCE does.

Here’s how I’m creating the context:

        eglDisplay = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
        if (eglDisplay == EGL10.EGL_NO_DISPLAY) {
            throw new RuntimeException("eglGetDisplay failed");
        }

        int[] version = new int[2];
        if (!egl.eglInitialize(eglDisplay, version)) {
            throw new RuntimeException("eglInitialize failed");
        }

        EGLConfig eglConfig = chooseEglConfig();
        eglContext = createContext(egl, eglDisplay, eglConfig);

        eglSurface = egl.eglCreateWindowSurface(eglDisplay, eglConfig, surfaceTexture, null);

        if (eglSurface == null || eglSurface == EGL10.EGL_NO_SURFACE) {
            throw new RuntimeException("GL Error: " + GLUtils.getEGLErrorString(egl.eglGetError()));
        }

        if (!egl.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext)) {
            throw new RuntimeException("GL make current error: " + GLUtils.getEGLErrorString(egl.eglGetError()));
        }

    private EGLContext createContext(EGL10 egl, EGLDisplay eglDisplay, EGLConfig eglConfig) {
        // EGL10 doesn't define EGL_CONTEXT_CLIENT_VERSION, so use its literal value
        int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
        int[] attribList = {EGL_CONTEXT_CLIENT_VERSION, 3, EGL10.EGL_NONE};
        return egl.eglCreateContext(eglDisplay, eglConfig, EGL10.EGL_NO_CONTEXT, attribList);
    }

    private EGLConfig chooseEglConfig() {
        int[] configsCount = new int[1];
        EGLConfig[] configs = new EGLConfig[1];
        int[] configSpec = getConfig();

        if (!egl.eglChooseConfig(eglDisplay, configSpec, configs, 1, configsCount)) {
            throw new IllegalArgumentException("Failed to choose config: " + GLUtils.getEGLErrorString(egl.eglGetError()));
        } else if (configsCount[0] > 0) {
            return configs[0];
        }

        return null;
    }

As you can see, I'm passing EGL_CONTEXT_CLIENT_VERSION, 3 when creating the context.
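
Once eglMakeCurrent succeeds, a quick way to confirm which version the driver actually created is to query the current context (a minimal sketch; it just logs the version string):

    // Sanity check: report the context version the driver actually created,
    // which may be higher than the version requested in the attrib list.
    String glVersion = android.opengl.GLES20.glGetString(android.opengl.GLES20.GL_VERSION);
    Log.d(LOG_TAG, "GL version: " + glVersion); // e.g. "OpenGL ES 3.2 ..."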

I also imported everything from GLES30:

    import static android.opengl.GLES30.GL_TRIANGLE_STRIP;
    import static android.opengl.GLES30.glDrawArrays;
    import static android.opengl.GLES30.GL_CLAMP_TO_EDGE;
    import static android.opengl.GLES30.GL_LINEAR;
    import static android.opengl.GLES30.GL_STREAM_DRAW;
    import static android.opengl.GLES30.GL_TEXTURE0;
    import static android.opengl.GLES30.GL_TEXTURE_2D;
    import static android.opengl.GLES30.GL_TEXTURE_MAG_FILTER;
    import static android.opengl.GLES30.GL_TEXTURE_MIN_FILTER;
    import static android.opengl.GLES30.GL_TEXTURE_WRAP_S;
    import static android.opengl.GLES30.GL_TEXTURE_WRAP_T;
    import static android.opengl.GLES30.glActiveTexture;
    import static android.opengl.GLES30.glBindBuffer;
    import static android.opengl.GLES30.glBindTexture;
    import static android.opengl.GLES30.glBufferData;
    import static android.opengl.GLES30.glBufferSubData;
    import static android.opengl.GLES30.glGenBuffers;
    import static android.opengl.GLES30.glGenTextures;
    import static android.opengl.GLES30.glGetUniformLocation;
    import static android.opengl.GLES30.glTexImage2D;
    import static android.opengl.GLES30.glTexParameteri;
    import static android.opengl.GLES30.glTexSubImage2D;
    import static android.opengl.GLES30.glUniform1f;
    import static android.opengl.GLES30.glUniform1i;
    import static android.opengl.GLES30.GL_PIXEL_UNPACK_BUFFER;

But I needed to import GL_LUMINANCE from GLES20 for it to work.

So why do I need to use GL_LUMINANCE?

What is the texture’s internal format? If it’s GL_LUMINANCE, then you need to use GL_LUMINANCE as the (external) format; likewise, a GL_LUMINANCE_ALPHA internal format requires GL_LUMINANCE_ALPHA.

Unlike core-profile desktop OpenGL, OpenGL ES 3.x still supports luminance and luminance-alpha textures (although appendix G.1 of the specification marks them as a “legacy” feature which may be removed from future versions).
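
If you want the ES 3.x-style single-channel path instead, the key is that the internal and external formats have to match. A minimal sketch using a sized internal format (width and height are placeholders; GL_R8 and GL_RED come from GLES30):

    // Sketch: ES 3.0-style single-channel texture. A sized internal format
    // such as GL_R8 pairs with the external format GL_RED.
    glTexImage2D(GL_TEXTURE_2D,
            0,
            GL_R8,             // sized internal format (requires ES 3.0+)
            width, height,
            0,
            GL_RED,            // external format that matches GL_R8
            GL_UNSIGNED_BYTE,
            null);

Sampling such a texture in a shader returns (r, 0, 0, 1) rather than (r, r, r, 1).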

if (!initiatedTextures) {
    Log.d(LOG_TAG, "initiating textures");
    // TODO: delete these textures
    glGenTextures(1, textureId);

    glBindTexture(GL_TEXTURE_2D, textureId.get(0));
    glTexImage2D(GL_TEXTURE_2D,
            0,
            GL_LUMINANCE,     // internal format
            2304,
            1296,
            0,
            GL_LUMINANCE,     // external format; must match the internal format
            GL_UNSIGNED_BYTE,
            null);            // allocate only; decodedFrame->buffer[i] is uploaded below
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    initiatedTextures = true;
}

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureId.get(0));

glTexSubImage2D(GL_TEXTURE_2D,
        0,
        0,
        0,
        2304,
        1296,
        GL_LUMINANCE,
        GL_UNSIGNED_BYTE,
        buffer);

This is how I'm creating and using the texture. If I simply replace all three occurrences of GL_LUMINANCE with GL_RED, it won't work. It also works if I use GL_RGBA.

That would give you a texture with only a red channel. You’d need to expand that with e.g. color.rrr in the shader to get the same result.

Since I’m using the shader to render YUV video, I only need the red component in each texture (I use 3 textures). I actually thought GL_LUMINANCE was the same as GL_RED. Isn’t it?
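
In case it helps, this is roughly how the three planes map to textures (a sketch assuming 4:2:0 subsampling, so the U and V planes are half the Y plane's size in each dimension, and assuming textureId holds three names from glGenTextures(3, textureId)):

    // Sketch: one single-channel texture per YUV plane (4:2:0 assumed).
    int[] planeWidths  = {2304, 2304 / 2, 2304 / 2};
    int[] planeHeights = {1296, 1296 / 2, 1296 / 2};
    for (int i = 0; i < 3; i++) {
        glBindTexture(GL_TEXTURE_2D, textureId.get(i));
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
                planeWidths[i], planeHeights[i], 0,
                GL_LUMINANCE, GL_UNSIGNED_BYTE, null);
    }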

Please take a look at my shader:

#version 320 es

#ifdef GL_ES
precision mediump int;
precision mediump float;
#endif

uniform sampler2D tex_y;
uniform sampler2D tex_u;
uniform sampler2D tex_v;

in vec2 TexCoord;
out vec4 FragColor;

void main()
{
    vec3 yuv;
    vec4 rgba;
    yuv.r = texture(tex_y, TexCoord).r - 0.0625;
    yuv.g = texture(tex_u, TexCoord).r - 0.5;
    yuv.b = texture(tex_v, TexCoord).r - 0.5;

    rgba.r = yuv.r + 1.402 * yuv.b;
    rgba.g = yuv.r - 0.34413 * yuv.g - 0.71414 * yuv.b;
    rgba.b = yuv.r + 1.772 * yuv.g;
    rgba.a = 1.0; // alpha was never written; leaving it unset is undefined
    FragColor = rgba;
}

I have used the same shader on desktop, but with #version 330 core, where GL_RED worked but GL_LUMINANCE did not. Now I'm very confused.

No. Both formats have a single component, but sampling a GL_RED texture returns (x,0,0,1) whereas sampling a GL_LUMINANCE texture returns (x,x,x,1).

GL_LUMINANCE is more useful with the fixed-function pipeline; when using shaders, you can impose whatever interpretation you want on the data.

You’re only accessing the red component, so it doesn’t matter whether the texture is GL_LUMINANCE or GL_RED, so long as the application code follows the constraints imposed by the OpenGL version used:

- OpenGL ES 2.0 doesn’t support GL_RED, so you have to use GL_LUMINANCE.
- OpenGL ES 3.x supports both, but requires that the external format and type match the internal format.
- Desktop OpenGL 3+ core profile doesn’t support luminance as either an internal or external format.
- Desktop OpenGL 1/2 and the compatibility profile support luminance and will convert almost any combination of external format and type to the internal format.
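
To make that concrete, here is a hedged sketch of how an application might pick a legal format pair for whichever context it got (isEs3 is a placeholder flag you would derive from the context version, e.g. via glGetString(GL_VERSION)):

    // Sketch: choose a single-channel internal/external format pair that is
    // legal for the current context. ES 2.0 only has the unsized GL_LUMINANCE
    // pair; ES 3.x also allows sized GL_R8 with GL_RED as the external format.
    int internalFormat = isEs3 ? GLES30.GL_R8  : GLES20.GL_LUMINANCE;
    int externalFormat = isEs3 ? GLES30.GL_RED : GLES20.GL_LUMINANCE;
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, internalFormat,
            width, height, 0, externalFormat, GLES20.GL_UNSIGNED_BYTE, null);

Since your shader only reads the .r component, either pair produces the same result.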