eglGetDisplay returns EGL_NO_DISPLAY

When i call eglGetDisplay, it returns EGL_NO_DISPLAY.

The argument I am passing it is EGL_DEFAULT_DISPLAY.

I am running this on Ubuntu Linux 18.04 on an x64 desktop machine.

How can I root cause this problem?

I am using the x86_64 version of libEGL.so.1 obtained from the Ubuntu package manager.

I am using the Nvidia drivers version 435.21.
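
In case it helps, a minimal sketch of what I am doing (this is not my actual application code, just the relevant calls):

    #include <EGL/egl.h>
    #include <cstdio>

    int main()
    {
        // This is the call that fails for me: with EGL_DEFAULT_DISPLAY it
        // returns EGL_NO_DISPLAY instead of a usable display handle.
        EGLDisplay display = eglGetDisplay( EGL_DEFAULT_DISPLAY );
        if ( display == EGL_NO_DISPLAY ) {
            std::printf( "eglGetDisplay returned EGL_NO_DISPLAY\n" );
            return 1;
        }

        EGLint major, minor;
        if ( !eglInitialize( display, &major, &minor ) ) {
            std::printf( "eglInitialize failed\n" );
            return 1;
        }

        std::printf( "EGL %d.%d initialized\n", major, minor );
        return 0;
    }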

So what do you get if you run these two commands from your shell:

echo $DISPLAY
glxinfo | grep ':'

:1

name of display: :1
display: :1 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
GLX version: 1.4
GLX extensions:
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 2048 MB
Total available memory: 2048 MB
Currently available dedicated video memory: 1244 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GT 1030/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 435.21
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.6.0 NVIDIA 435.21
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 435.21
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:
215 GLXFBConfigs:

Looks pretty much the same as mine here (running NVIDIA GL drivers on an NVIDIA GPU). And here, EGL works fine, albeit on display :0 instead of :1.

Are you first getting a non-null handle from XOpenDisplay()?

Are you then passing that display handle to eglGetDisplay()?

Is your :1 display mapped to an actual GPU video output, and not a virtual display?
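
One quick way to sanity-check that last point (just a suggestion; adjust to your setup) is to ask the X server which outputs it considers physically connected:

    # Outputs marked "connected" are driving a real monitor;
    # a headless/virtual display will typically show none.
    xrandr --query | grep ' connected'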

I am not getting a handle from XOpenDisplay() because I am not calling it at all.

I simply pass EGL_DEFAULT_DISPLAY to eglGetDisplay(). I think EGL_DEFAULT_DISPLAY is a null handle.

This works on some embedded devices that I am using but not on my Linux desktop. I don’t understand why.

How do I discern whether :1 is mapped to an actual GPU video output vs. a virtual display?

Ok. Sorry. I’d slept and forgotten that you’d mentioned that in your original post.

I looked back at some test code where I’d done exactly that (created an EGL context using EGL_DEFAULT_DISPLAY for rendering to an off-screen PBuffer), and there’s a trick to get this to work.

The EGL/OpenGL libs you’re linked with must be able to open kernel “special” files related to the graphics driver such as /dev/nvidiactl, /dev/nvidia0, /dev/nvidia-modeset, /dev/dri/card0, and /dev/dri/renderD128. On my Linux system, these are owned by group video. So by adding my username to the video group, the EGL/OpenGL libs running as me were then able to open these files and successfully create an EGL context, even in cases where I was not the user who had started the X server.
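
To see what your system uses, something like this will show which group owns those device nodes and add you to it (the group is often "video", but names vary by distro, so treat this as an example):

    # Check the owner/group of the driver device nodes
    ls -l /dev/nvidia* /dev/dri/*

    # Check which groups you are currently in
    id

    # Add yourself to the owning group, then log out and back in
    sudo usermod -aG video $USER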

Once your username has the required access, then this should work:

    EGLDisplay display = eglGetDisplay( EGL_DEFAULT_DISPLAY );
    assert( display != EGL_NO_DISPLAY );

    EGLint egl_major, egl_minor;
    EGLBoolean success = eglInitialize( display, &egl_major, &egl_minor );
    assert( success );
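
If that succeeds, a quick way to confirm which EGL implementation you actually got (just a suggestion, not required) is to print its vendor and version strings:

    // Requires <stdio.h>; "display" is the handle initialized above.
    printf( "EGL_VENDOR : %s\n", eglQueryString( display, EGL_VENDOR ) );
    printf( "EGL_VERSION: %s\n", eglQueryString( display, EGL_VERSION ) );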

Related thread:

On my system, those special files are owned by root.

I added my username to the root group and tried again, but it did not help.

I still get EGL_NO_DISPLAY.

Ok. Let’s try this first with a connection to the X server. Try this:

    // Open an explicit connection to the X server
    // (adjust ":0" to match your $DISPLAY, e.g. ":1")
    Display *xDisplay = XOpenDisplay( ":0" );
    assert( xDisplay );

    EGLNativeDisplayType nativeDisplay = (EGLNativeDisplayType) xDisplay;

    EGLDisplay display = eglGetDisplay( nativeDisplay );
    assert( display != EGL_NO_DISPLAY );

Log into your X server, and run this app in a shell window running on your X desktop.
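
Assuming you wrap that snippet in a small main() with the X11 and EGL headers and save it as, say, egl_x11_test.cpp (the filename is arbitrary), it should build and run with something like:

    g++ egl_x11_test.cpp -o egl_x11_test -lX11 -lEGL
    ./egl_x11_test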

I am a total noob.

How do I log into my X server?

That’s OK. If you sit down at a monitor connected to the physical machine by a video cable, and you log in (enter your username and password) on a graphical login screen, and then a graphical desktop is displayed, then you’re most likely logging into the X server.

Your glxinfo suggests you are probably doing exactly this.

Just try running an app with the above code snippet from a shell window and see if it works for you.

That test program worked.

This test program also worked:

#include <cassert>
#include <dlfcn.h>
#include <EGL/egl.h>
#include <iostream>
#include <X11/Xlib.h>

int main()
{
    if (void* egl_lib = dlopen ("libEGL.so.1", RTLD_LAZY)) {
        std::cout << "successfully opened egl" << std::endl;

        PFNEGLGETDISPLAYPROC eglGetDisplay = reinterpret_cast <PFNEGLGETDISPLAYPROC> (dlsym (egl_lib, "eglGetDisplay"));
        if (eglGetDisplay) {
            EGLDisplay display = eglGetDisplay( EGL_DEFAULT_DISPLAY );
            assert( display != EGL_NO_DISPLAY );
            std::cout << "successfully mapped eglGetDisplay" << std::endl;
        } else {
            std::cout << "Failed to map eglGetDisplay" << std::endl;
        }

        dlclose (egl_lib);
    }
    return 0;
}
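
(Since this version loads libEGL.so.1 at runtime via dlopen rather than linking against it, it should only need the dynamic-loader library to build; the filename is arbitrary:)

    g++ egl_dlopen_test.cpp -o egl_dlopen_test -ldl
    ./egl_dlopen_test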

I don’t understand why it won’t work in my other piece of code.

One thing is that the other piece of code is running inside a chroot, so it may not have access to /dev.

Need to go back and check.

I’ll bet you’re onto something there.

Try running your app normally (i.e. not in a chroot), but running as a user other than the one that is logged into the X server (graphical login). That’s a good first step (and what I have working here using the EGL_DEFAULT_DISPLAY method, with no X server display connection required). Once that works, then it makes sense to look at what’s different between that and running in a chroot.
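
If the chroot does turn out to be the problem, the usual approach (I’m assuming a plain chroot here; adjust the paths to your setup) is to bind-mount the driver device nodes, and the X socket if you need it, into the chroot before entering it:

    # Expose the driver device nodes inside the chroot
    sudo mount --bind /dev /path/to/chroot/dev

    # Only needed if the app also connects to the X server
    sudo mount --bind /tmp/.X11-unix /path/to/chroot/tmp/.X11-unix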

Just FYI, the way I debugged this when having similar problems was just running the EGL app under strace and then scanning the output to see what file and connection open calls are failing:

strace myapp >& OUT

(or if you’re a bash user:)

strace myapp > OUT 2>&1

Besides looking for the NVIDIA and X special files I listed above, also look for things like EACCES, EPERM, and other errors which indicate a failure to gain access to a resource.
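
For example, something like this pulls out the interesting lines from that capture (OUT is the file from the commands above; the patterns are only suggestions):

    grep -E 'nvidia|dri|X11-unix|EACCES|EPERM|ENOENT' OUT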