After analyzing the link you sent me: Trying to process with OpenGL an EGLImage created from a dmabuf_fd - Jetson TX1 - NVIDIA Developer Forums
I found that the only thing I was doing differently was the EGL surface. I wasn't creating a surface at all, because I thought it wasn't needed.
After doing this:
const EGLint context_attrib_list[] = {
    EGL_CONTEXT_CLIENT_VERSION, 2,
    EGL_NONE
};
const EGLint pbuffer_attrib_list[] = {
    EGL_WIDTH, 2304,
    EGL_HEIGHT, 1296,
    EGL_NONE
};
const EGLint config_attrib_list[] = {
    EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_NONE
};

EGLDisplay egl_display;
EGLSurface egl_surface;
EGLContext egl_context;
EGLConfig egl_config;
EGLint matching_configs;

egl_display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
eglInitialize(egl_display, NULL, NULL);
eglBindAPI(EGL_OPENGL_ES_API);
eglChooseConfig(egl_display, config_attrib_list, &egl_config, 1, &matching_configs);
egl_surface = eglCreatePbufferSurface(egl_display, egl_config, pbuffer_attrib_list);
egl_context = eglCreateContext(egl_display, egl_config, EGL_NO_CONTEXT, context_attrib_list);
eglMakeCurrent(egl_display, egl_surface, egl_surface, egl_context);
With that in place, rendering to the framebuffer seems to more or less work. However, the data I'm reading back looks like this:
49 49 49 255 52 52 52 255 57 57 57 255 60 60 60 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 55 55 55 255 55 55 55 255 55 55 55 255 55 55 55 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 55 55 55 255 55 55 55 255 55 55 55 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 56 56 56 255 55 55 55 255 55 55 55 255 55 55 55 255 55 55 ...
As you can see, the R, G and B values always repeat, and the alpha is always 255. I don't think this is right, and I can't find an explanation. I've filled my framebuffer texture with RGBA data:
glBindTexture(GL_TEXTURE_2D, frameBufferTexture);
// GL_RGBA rather than GL_RGBA8: unextended OpenGL ES 2.0 requires the
// internal format to match the format argument
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
and I’m reading as RGBA:
glReadPixels(0, 0, 512, 512, GL_RGBA, GL_UNSIGNED_BYTE, r);
Another big problem I'm having is that this code only works when called outside GTK. If it runs from GTK's signal_render handler, I get:
0 0 0 255 0 0 0 255 ...
which suggests NvEGLImageFromFd or glEGLImageTargetTexture2DOES isn't filling the texture, and I'm reading from a blank texture.
So could it be that GTK is interfering with my eglMakeCurrent call, and the surface is being replaced? Since I've found that a pbuffer surface is needed for NvEGLImageFromFd or glEGLImageTargetTexture2DOES to work, the GTK failure suggests that some of my EGL context state is being overwritten.