eglInitialize failed

Hi Community,
I am trying to run a small EGL test program, but it fails during initialization. Here is my test code.

#include <iostream>
#include <EGL/egl.h>

int main()
{
    EGLDisplay display = eglGetDisplay( EGL_DEFAULT_DISPLAY );
    if (display == EGL_NO_DISPLAY)
    {
        std::cout << "unable to open connection to local windowing system." << std::endl;
    }

    EGLint major, minor;
    if (!eglInitialize(display, &major, &minor))
    {
        std::cout << "unable to initialize EGL." << std::endl;
    }

    std::cout << "egl error code : " << eglGetError() << std::endl;

    return 0;
}

And the output is :

unable to initialize EGL.
egl error code : 12289

My platform is CentOS 7.6 without an X server. Besides, I am running the test code on a DCU, which is an AMD GPU.

Let me know if you want more information.
Thanks for your help.

Try this instead:

#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <EGL/egl.h>
...

  const char *display_str = getenv( "DISPLAY" ) ? getenv( "DISPLAY" ) : ":0.0";

  Display *display = XOpenDisplay( display_str );
  if ( !display )
  {
    printf( "ERROR: XOpenDisplay( %s ) failed.\n", display_str );
    exit( 1 );
  }

  EGLDisplay egl_display = eglGetDisplay( display );
  if ( !egl_display )
  {
    printf( "ERROR: eglGetDisplay() failed.\n" );
    exit( 1 );
  }
...

The output changed to:

No protocol specified
ERROR: XOpenDisplay( :0.0 ) failed.

The user you’re running as can’t connect to the X server running on the machine. Fix that first. Then retry your EGL connect test.

Once apps like glxgears and glxinfo run properly, you’re ready to retry your EGL test.

Which driver and driver version are you running? I have no idea if AMD drivers support creating EGL contexts without access to the X server, like NVIDIA does. However, if they do, there may be some tricks to ensuring that the user that you’re running as can connect to the underlying graphics driver. You can investigate this using strace. See this post where I detailed a similar investigation I did to figure out what access NVIDIA’s EGL support needed in order to create an EGL context without an X server connection.

There is no X server running on my DCU server. When I try to run glxgears or glxinfo, the output is:
Error: unable to open display.
So I tried the test code on another Linux server, also based on CentOS. It has no GPU or DCU, and also no X server. After testing, I got this:
ERROR: XOpenDisplay( :0.0 ) failed.
I was trying to run glxgears or glxinfo on this Linux server. I have some questions about using EGL:

  1. When there is no GPU on the server, will EGL fall back to the CPU?
  2. I also ran the test code on an Ubuntu system, and it works well there. So I wonder whether the failure is caused by a system problem.

Looking forward to your reply.

That is expected. You can’t connect to an X server if it doesn’t exist. You need an X server running. This provides the graphical display capability. You should get a similar error when running xdpyinfo.

Again, that’s the expected result. You can’t connect to an X server if it doesn’t exist.

So you want to initialize EGL on a Linux system that both:

  1. Does not have an X server running (for window display capability) and
  2. Does not contain a GPU.

Obviously, you won’t be connecting to an EGL implementation in a graphics driver provided by your GPU vendor, because there isn’t one. You’ll need to use a library or environment that provides EGL plus OpenGL and/or OpenGL ES for completely CPU-based off-screen rendering. Mesa3D might do this. Check with them.
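If Mesa’s EGL library is installed, one route worth trying is its surfaceless platform, exposed by the EGL_MESA_platform_surfaceless client extension. A minimal sketch follows; it assumes a Mesa build that advertises that extension and has not been verified on CentOS:

```cpp
// Sketch: acquire an EGL display via Mesa's surfaceless platform (no window
// system needed). Assumes Mesa's libEGL with EGL_MESA_platform_surfaceless.
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <cstring>
#include <iostream>

#ifndef EGL_PLATFORM_SURFACELESS_MESA
#define EGL_PLATFORM_SURFACELESS_MESA 0x31DD
#endif

int main()
{
    // Check that the client extension is advertised before using it.
    const char *exts = eglQueryString( EGL_NO_DISPLAY, EGL_EXTENSIONS );
    if ( !exts || !strstr( exts, "EGL_MESA_platform_surfaceless" ) )
    {
        std::cout << "EGL_MESA_platform_surfaceless not supported." << std::endl;
        return 1;
    }

    PFNEGLGETPLATFORMDISPLAYEXTPROC getPlatformDisplay =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC) eglGetProcAddress( "eglGetPlatformDisplayEXT" );
    // The surfaceless platform takes no native display, hence nullptr.
    EGLDisplay display = getPlatformDisplay( EGL_PLATFORM_SURFACELESS_MESA,
                                             nullptr, nullptr );
    EGLint major, minor;
    if ( display == EGL_NO_DISPLAY || !eglInitialize( display, &major, &minor ) )
    {
        std::cout << "surfaceless EGL init failed, error " << eglGetError() << std::endl;
        return 1;
    }
    std::cout << "EGL " << major << "." << minor << " initialized." << std::endl;
    eglTerminate( display );
    return 0;
}
```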

Thanks for your reply.
Yesterday I tried the test code on a personal computer running CentOS 8; glxgears runs properly there, so the X server is fine. After testing, everything is OK.
As you mentioned before, EGL can support off-screen rendering without an X server. However, when I tried it again on a server without an X server, it failed in the same way. It seems the device cannot find a display for EGL. I don’t understand the relationship between a display and the X server, and I wonder whether the display can be initialized correctly without an X server.
By the way, I have tried this. But it still fails to initialize.

No. While EGL can in some circumstances be used for off-screen rendering via OpenGL or OpenGL ES without a window system (e.g. without an X server running in the Linux world), its primary purpose is to serve as an interface between OpenGL or OpenGL ES and the native window system.

EGL is very commonly used on Android mobile platforms, but can be found on PC as well.

Think of a “display” as a connection to the windowing system. And in the X server world, a display can have multiple “screens”. These screens usually correspond to the monitors on your system.

However, some graphics drivers have the ability to provide access to a virtual display (in cases where there is no underlying windowing system) for off-screen rendering purposes only. NVIDIA’s graphics drivers for example support this.

  1. Buy an NVIDIA GPU.
  2. Install it and NVIDIA Graphics Drivers on your system.

And then you’ll be able to perform off-screen rendering via EGL and OpenGL or OpenGL ES, without an X server running on your system, when running directly on the machine (not in a VM).

This EGL + OpenGL / OpenGL ES rendering support without an X server is a feature of the NVIDIA graphics drivers specifically, not a feature of Linux in general.
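For reference, the X-free path on NVIDIA drivers goes through the EGL_EXT_device_enumeration and EGL_EXT_platform_device extensions: enumerate GPUs, then open a display on one of them. A minimal sketch, assuming a driver that exposes both extensions:

```cpp
// Sketch: enumerate EGL devices and initialize a display on the first one,
// with no X server involved. Requires EGL_EXT_device_enumeration and
// EGL_EXT_platform_device (provided by NVIDIA's drivers, among others).
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <iostream>

int main()
{
    PFNEGLQUERYDEVICESEXTPROC eglQueryDevicesEXT =
        (PFNEGLQUERYDEVICESEXTPROC) eglGetProcAddress( "eglQueryDevicesEXT" );
    PFNEGLGETPLATFORMDISPLAYEXTPROC eglGetPlatformDisplayEXT =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC) eglGetProcAddress( "eglGetPlatformDisplayEXT" );
    if ( !eglQueryDevicesEXT || !eglGetPlatformDisplayEXT )
    {
        std::cout << "EGL device extensions not available." << std::endl;
        return 1;
    }

    EGLDeviceEXT devices[8];
    EGLint num_devices = 0;
    eglQueryDevicesEXT( 8, devices, &num_devices );
    std::cout << num_devices << " EGL device(s) found." << std::endl;
    if ( num_devices < 1 )
        return 1;

    EGLDisplay dpy = eglGetPlatformDisplayEXT( EGL_PLATFORM_DEVICE_EXT,
                                               devices[0], nullptr );
    EGLint major, minor;
    if ( dpy == EGL_NO_DISPLAY || !eglInitialize( dpy, &major, &minor ) )
    {
        std::cout << "eglInitialize failed, error " << eglGetError() << std::endl;
        return 1;
    }
    std::cout << "EGL " << major << "." << minor << " on device 0." << std::endl;
    eglTerminate( dpy );
    return 0;
}
```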

Hi,
I have switched to an NVIDIA GPU cluster; the NVIDIA graphics driver version is 440.33.01. I ran the test code on it. On the login node it works fine, but once I submit it to a computing node, it goes wrong. The output is:

ERROR: XOpenDisplay( localhost:14.0 ) failed.

I have no idea what’s wrong with it. Looking forward to your reply.

ERROR: XOpenDisplay( localhost:14.0 ) failed suggests two things:

  1. You are trying to use the EGL connect method that requires a connection to the X server, and
  2. You are not directly logged into the X server / display manager on that system, but rather are running in an SSH connection to or a VM session on that system.

If you want to create an EGL connection without a connection to the X server, you don’t want to use the X server connection method:

    Display *display = XOpenDisplay( NULL );
    ...
    EGLDisplay egl_display = eglGetDisplay( display );

You want to use the:

    EGLDisplay egl_display = eglGetDisplay( EGL_DEFAULT_DISPLAY );

method described in this article:

Hi, I’m in the same situation. Have you solved it yet?
My platform is Ubuntu 16.04 without X, and the NVIDIA driver is 440.33.01 too.
I also use eglGetDisplay(EGL_DEFAULT_DISPLAY) to get a display, but eglInitialize still fails.
Looking forward to your reply.

See the thread linked below, and in particular that post.

In order to:

  • create a GPU-accelerated EGL context
  • via the NVIDIA graphics driver on Linux
  • using the display eglGetDisplay( EGL_DEFAULT_DISPLAY ) (instead of XOpenDisplay())

the Linux usermode driver/libs need to be able to open the NVIDIA Linux driver’s kernel special files. Namely:

  • /dev/nvidia0
  • /dev/nvidiactl
  • /dev/nvidia-modeset
  • /dev/dri/card0 , and
  • /dev/dri/renderD128

If your system is set up similarly to the one here:

> ls -l /dev/nvidia0 /dev/nvidiactl /dev/nvidia-modeset /dev/dri/card0 /dev/dri/renderD128
crw-rw----+ 1 root video 226,   0 Aug  5 17:34 /dev/dri/card0
crw-rw----+ 1 root video 226, 128 Aug  5 17:34 /dev/dri/renderD128
crw-rw----+ 1 root video 195, 254 Aug  5 17:34 /dev/nvidia-modeset
crw-rw----+ 1 root video 195,   0 Aug  5 17:34 /dev/nvidia0
crw-rw----+ 1 root video 195, 255 Aug  5 17:34 /dev/nvidiactl

that just involves adding your username to the video group in the /etc/group file. You may then need to log out and back in to activate this change.

Thank you very much. There is no /dev/dri; that’s the reason why I failed.

Ok, sounds good.

If you haven’t already, run your app under strace (strace <appname> <args>), redirect the output to a file, and look for references to those devices. If you see the driver trying and failing to open one of them, that could be it.

I encountered the same error 12289 here, but my GPU drivers are normal. I can call EGL from Python code and it works normally; in C++, eglInitialize always returns error 12289. The following is my code:

 static const EGLint configAttribs[] = {
     EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,  // keep if using a Pbuffer
     EGL_BLUE_SIZE, 8,
     EGL_GREEN_SIZE, 8,
     EGL_RED_SIZE, 8,
     EGL_DEPTH_SIZE, 8,
     EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
     EGL_NONE
 };

 static const int pbufferWidth = width;
 static const int pbufferHeight = height;

 static const EGLint pbufferAttribs[] = {
     EGL_WIDTH, pbufferWidth,
     EGL_HEIGHT, pbufferHeight,
     EGL_NONE,
 };

 // Initialize EGL
 EGLDisplay eglDpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
 if (eglDpy == EGL_NO_DISPLAY) {
     std::cerr << "无法获取 EGL 显示 (Error: " << eglGetError() << ")" << std::endl;
     return;
 }
 std::cout << "EGLDisplay: " << eglDpy << " (Error: " << eglGetError() << ")\n";

 EGLint major, minor;
 if (eglInitialize(eglDpy, &major, &minor) == EGL_FALSE) {
     std::cerr << "EGL Initialization failed (Error: " << eglGetError() << ")" << std::endl;
     return;
 }
 std::cout << "EGL initialized: " << major << "." << minor << " (Error: " << eglGetError() << ")\n";

 // Choose an EGL config
 EGLint numConfigs;
 EGLConfig eglCfg;
 if (eglChooseConfig(eglDpy, configAttribs, &eglCfg, 1, &numConfigs) == EGL_FALSE || numConfigs == 0) {
     std::cerr << "选择 EGL 配置失败 (Error: " << eglGetError() << ")" << std::endl;
     return;
 }
 std::cout << "EGLConfig: " << eglCfg << " (NumConfigs: " << numConfigs << ", Error: " << eglGetError() << ")\n";

 // Create a Pbuffer surface, or skip this step for surfaceless rendering
 EGLSurface eglSurface = eglCreatePbufferSurface(eglDpy, eglCfg, pbufferAttribs);
 if (eglSurface == EGL_NO_SURFACE) {
     std::cerr << "创建 Pbuffer 表面失败 (Error: " << eglGetError() << ")" << std::endl;
     return;
 }
 std::cout << "EGLSurface: " << eglSurface << " (Error: " << eglGetError() << ")\n";

 // Bind the OpenGL API
 eglBindAPI(EGL_OPENGL_API);
 std::cout << "API bound (Error: " << eglGetError() << ")\n";

 // Create the context
 EGLint contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 3, EGL_NONE };
 EGLContext eglContext = eglCreateContext(eglDpy, eglCfg, EGL_NO_CONTEXT, contextAttribs);
 if (eglContext == EGL_NO_CONTEXT) {
     std::cerr << "创建上下文失败 (Error: " << eglGetError() << ")" << std::endl;
     return;
 }

 // Make the context current
 if (eglMakeCurrent(eglDpy, eglSurface, eglSurface, eglContext) == EGL_FALSE) {
     std::cerr << "使 EGL 上下文当前失败 (Error: " << eglGetError() << ")" << std::endl;
     return;
 }

 std::cout << "EGL 上下文成功创建并绑定。" << std::endl;

error log

root@be70f1965008:/opt/cpp/StreamPusher/cvdemo# ./build/OpenGLFilterDemo 
19201080
EGLDisplay: 0x5629a845d070 (Error: 12288)
EGL Initialization failed (Error: 12289)
Segmentation fault (core dumped)
root@be70f1965008:/opt/cpp/StreamPusher/cvdemo# vainfo 
error: XDG_RUNTIME_DIR is invalid or not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.17.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_17
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.17 (libva 2.12.0)
vainfo: Driver version: Intel iHD driver for Intel(R) Gen Graphics - 23.1.1 ()
vainfo: Supported profile and entrypoints
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileNone                   : VAEntrypointStats
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSliceLP
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSliceLP
      VAProfileJPEGBaseline           : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointEncPicture
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointEncSliceLP
      VAProfileHEVCMain10             : VAEntrypointVLD
      VAProfileHEVCMain10             : VAEntrypointEncSliceLP
      VAProfileVP9Profile0            : VAEntrypointVLD
      VAProfileVP9Profile0            : VAEntrypointEncSliceLP
      VAProfileVP9Profile1            : VAEntrypointVLD
      VAProfileVP9Profile1            : VAEntrypointEncSliceLP
      VAProfileVP9Profile2            : VAEntrypointVLD
      VAProfileVP9Profile2            : VAEntrypointEncSliceLP
      VAProfileVP9Profile3            : VAEntrypointVLD
      VAProfileVP9Profile3            : VAEntrypointEncSliceLP
      VAProfileHEVCMain12             : VAEntrypointVLD
      VAProfileHEVCMain422_10         : VAEntrypointVLD
      VAProfileHEVCMain422_12         : VAEntrypointVLD
      VAProfileHEVCMain444            : VAEntrypointVLD
      VAProfileHEVCMain444            : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_10         : VAEntrypointVLD
      VAProfileHEVCMain444_10         : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_12         : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain10          : VAEntrypointVLD
      VAProfileHEVCSccMain10          : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain444         : VAEntrypointVLD
      VAProfileHEVCSccMain444         : VAEntrypointEncSliceLP
      VAProfileAV1Profile0            : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointEncSliceLP
root@be70f1965008:/opt/cpp/StreamPusher/cvdemo# 

From egl.h:

  • 12288 = 0x3000 = EGL_SUCCESS
  • 12289 = 0x3001 = EGL_NOT_INITIALIZED

So your eglInitialize() call is failing.

This probably has something to do with it. Suggest you fix that, and then retry.

Thank you for your answer. I don’t think this is the problem, because I am calling it from within Docker, and I can call EGL correctly using Python code in the same environment.

Hello, I have solved it. I can successfully initialize EGL by setting this environment variable: export EGL_PLATFORM=surfaceless
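For later readers, a hedged sketch of the same fix applied from inside the program: the variable must be set before the first EGL call, and this assumes Mesa’s libEGL, which reads EGL_PLATFORM to pick its platform backend.

```cpp
// Sketch: select Mesa's surfaceless platform programmatically. Must run
// before the first EGL call; assumes Mesa's libEGL honors EGL_PLATFORM.
#include <EGL/egl.h>
#include <cstdlib>
#include <iostream>

int main()
{
    // Equivalent to: export EGL_PLATFORM=surfaceless
    setenv( "EGL_PLATFORM", "surfaceless", 1 );

    EGLDisplay dpy = eglGetDisplay( EGL_DEFAULT_DISPLAY );
    EGLint major, minor;
    if ( dpy == EGL_NO_DISPLAY || !eglInitialize( dpy, &major, &minor ) )
    {
        std::cerr << "EGL init failed, error " << eglGetError() << std::endl;
        return 1;
    }
    std::cout << "EGL " << major << "." << minor << " initialized." << std::endl;
    eglTerminate( dpy );
    return 0;
}
```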

Nice find!