How to use OpenGL ES to render and output a YUV buffer?

Fragment shader code:

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : enable
#extension GL_EXT_YUV_target : require
precision mediump float;
in vec4 texcoord0;
out vec4 outColor;
uniform __samplerExternal2DY2YEXT diffuse;
void main() {
    outColor = texture(diffuse, texcoord0.xy);
}

Using this code, I can import a YUV image without doing a YUV-to-RGB conversion.
But I don't know how to output a YUV buffer.

Should I use an EGL pbuffer or pixmap surface?

I sincerely hope that I can get some suggestions from you.

Best Regards

According to the EXT_YUV_target specification, rendering to a YUV target requires declaring the fragment shader output with the layout(yuv) qualifier.

I have done this, but the output image is still RGBA.

I am wondering whether it is caused by the config I pass to eglCreateWindowSurface:

// like this
EGLint configSpec[] = {
            EGL_RED_SIZE, 8,
            EGL_GREEN_SIZE, 8,
            EGL_BLUE_SIZE, 8,
            EGL_ALPHA_SIZE, 8,
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
            EGL_NONE
    };

// pick a matching EGLConfig, then create the window surface from it
EGLConfig config;
EGLint numConfigs;
eglChooseConfig(display, configSpec, &config, 1, &numConfigs);
eglCreateWindowSurface(display, config, nwin, NULL);

So I want to know how to get a YUV buffer via off-screen rendering.

Window config doesn't come into it. YUV rendering is only available when rendering to an FBO with a YUV EGLImage attached as the colour buffer.

My goal is to output a YUV buffer using a YUV EGLImage.

I have already managed to import a YUV EGLImage through __samplerExternal2DY2YEXT,

but I don't know how to output a YUV buffer.

So I sincerely hope that you can explain the steps in detail.

Best Regards

That’s for sampling (reading) YUV textures in the shader.

For rendering, you declare the fragment shader output with the layout(yuv) qualifier and bind an FBO with a YUV EGLImage bound to the colour buffer.
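
Roughly, the setup looks like this. This is only a sketch, assuming an Android AHardwareBuffer as the backing store for the output YUV EGLImage (the function name, buffer format and error handling here are illustrative, not from any particular implementation); on other platforms you would create the EGLImage from a dma-buf or similar instead, and the extension entry points may need to be loaded through eglGetProcAddress:

#define EGL_EGLEXT_PROTOTYPES 1
#define GL_GLEXT_PROTOTYPES 1
#include <android/hardware_buffer.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES3/gl3.h>
#include <GLES2/gl2ext.h>

// Sketch: allocate a YUV 4:2:0 buffer, wrap it in an EGLImage, bind it to an
// external texture, and attach that texture to an FBO as the colour buffer.
GLuint createYuvTargetFbo(EGLDisplay dpy, uint32_t width, uint32_t height)
{
    AHardwareBuffer_Desc desc = {0};
    desc.width  = width;
    desc.height = height;
    desc.layers = 1;
    desc.format = AHARDWAREBUFFER_FORMAT_Y8Cb8Cr8_420;
    desc.usage  = AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE |
                  AHARDWAREBUFFER_USAGE_GPU_COLOR_OUTPUT;

    AHardwareBuffer *buf = NULL;
    if (AHardwareBuffer_allocate(&desc, &buf) != 0)
        return 0;

    EGLClientBuffer clientBuf = eglGetNativeClientBufferANDROID(buf);
    EGLint attrs[] = { EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE };
    EGLImageKHR image = eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                          EGL_NATIVE_BUFFER_ANDROID,
                                          clientBuf, attrs);

    // The EGLImage becomes the colour buffer: bind it to an external texture
    // and attach that texture to the FBO (GL_EXT_YUV_target allows
    // GL_TEXTURE_EXTERNAL_OES as a framebuffer attachment target).
    GLuint outTex, fbo;
    glGenTextures(1, &outTex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, outTex);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)image);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_EXTERNAL_OES, outTex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;  // the implementation can't render to this YUV image

    return fbo;
}

Anything you then render with a layout(yuv) fragment output lands in the AHardwareBuffer, which you can pass on to whatever consumes the YUV data.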

Thanks. Based on your answer, I improved the code.

Vertex shader:

#version 300 es
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 in_tex0;
out vec4 texcoord0;
void main() {
    texcoord0 = in_tex0;
    gl_Position = position;
}

In the fragment shader, I want to output a YUV buffer:

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : enable
#extension GL_EXT_YUV_target : require
precision mediump float;
in vec4 texcoord0;
layout(yuv) out vec4 outColor;
uniform __samplerExternal2DY2YEXT diffuse;
void main() {
    outColor = texture(diffuse, texcoord0.xy);
}

I wrote some new code now that I've learned more about how OpenGL works.
But I noticed that glDrawElements fails.
glGetError() returns 1282. Do you know what's happening?

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);

glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, EGLImage);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_EXTERNAL_OES, tex, 0);

glVertexAttribPointer(0,3,GL_FLOAT,GL_FALSE, 3*sizeof(GLfloat), vertices);
glVertexAttribPointer(1,2,GL_FLOAT,GL_FALSE, 2*sizeof(GLfloat), texcoords);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
glUniform1i(glGetUniformLocation(program, "diffuse"), 0);

glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, indices);
// glGetError() == 1282

1282 = 0x0502 = GL_INVALID_OPERATION.

That doesn't narrow it down very much. The specification and reference pages list three specific cases for glDrawElements, but that list is far from exhaustive. It's fairly typical for invalid state combinations to result in an error from the next draw call to be issued.

Also: you appear to be binding the texture tex both as a framebuffer attachment and as a sampled texture. Rendering to a texture while sampling from it in the same draw is a feedback loop, and that results in undefined behaviour.
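
In other words, use two textures: one external texture for the input YUV EGLImage you sample from, and a separate one, backed by a different YUV EGLImage, attached to the FBO colour buffer. A rough sketch, where inputImage, outputImage, program and indices stand in for objects you have already created:

// Input: the YUV EGLImage sampled through __samplerExternal2DY2YEXT.
GLuint inTex, outTex, fbo;
glGenTextures(1, &inTex);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, inTex);
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)inputImage);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// Output: a *different* YUV EGLImage, attached to the FBO as the colour buffer.
glGenTextures(1, &outTex);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, outTex);
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)outputImage);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_EXTERNAL_OES, outTex, 0);

// Draw: sample inTex, write through the layout(yuv) output into outTex.
glUseProgram(program);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, inTex);
glUniform1i(glGetUniformLocation(program, "diffuse"), 0);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, indices);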

Hey! I also want to render something into a YUV FBO - does this only work for YUV 444, or is YUV 420 also supported?

It’s implementation-defined. Some of the language in the extension specification implies that rendering to a 420 image is a possibility, but it doesn’t explicitly require this to be supported.
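
The practical way to check a particular device is to attach your 4:2:0 EGLImage-backed texture to the FBO and test framebuffer completeness before drawing; an unsupported combination should show up as an incomplete framebuffer. A sketch, with fbo set up as in the earlier snippets:

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    // e.g. GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT or GL_FRAMEBUFFER_UNSUPPORTED:
    // this implementation can't render into that 4:2:0 image.
}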