OpenGL version is forced to 1.4, but the card can support 4.6.0

Hello,
I have a very strange problem when using VcXsrv to display a GUI program from a Linux server.
Both computers have modern NVIDIA cards,
but when I run
$ glxinfo | grep ':'
I get:

name of display: 10.80.2.22:1
display: 10.80.2.22:1 screen: 0
direct rendering: No (LIBGL_ALWAYS_INDIRECT set)
server glx vendor string: SGI
server glx version string: 1.4
server glx extensions:
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
client glx extensions:
GLX version: 1.4
GLX extensions:
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 670/PCIe/SSE2
OpenGL version string: 1.4 (4.6.0 NVIDIA 441.08)
OpenGL extensions:
559 GLXFBConfigs:

Why is the OpenGL version limited to 1.4 when my card actually supports 4.6?
Could someone help me solve this?
My program needs at least version 4.5 T_T

This is why. Indirect GLX can’t support more than 1.4. Later versions introduced memory-mapped buffers which require the client and server to be able to share address space.

Because you (or the environment you’re running in) is forcing your application to use indirect rendering, which tends to limit the supported OpenGL version to 1.4. See the first couple lines in your output above.

Not only is indirect rendering being forced (LIBGL_ALWAYS_INDIRECT), but your $DISPLAY setting (10.80.2.22:1) should force that as well, as it is requesting potentially remote display of the content on another machine running an X server.

  • Indirect rendering means the app (client) and the graphics driver (on the server) communicate GL via a serialized packet protocol suitable for sending over the network.
  • “Direct” rendering provides for a more direct path for GL commands to be communicated from your app to the graphics driver.

To get full OpenGL support, you’ll want to use “direct” rendering. Direct rendering supports the latest OpenGL versions, while indirect rendering fell out of vogue years ago and does not.

What exactly is your setup? VcXsrv is apparently an X server running on Windows, yet you said you’re displaying on a Linux server. Which machine is your program physically running on, and which machine is it trying to display on? My guess:

  • Box A: Windows running VcXsrv and your application
  • Box B: Linux box (10.80.2.22) which you are trying to display your application’s output on

???

Thank you!
My program is running on Linux (Box B), which has a GPU card but no display device. However, this program needs OpenGL 4.5 to compute something.
Windows (Box A) has a screen but can’t run the code.

First, I tried MobaXterm on Windows to remotely run the code on Linux, but I only got GL version 3.0 (it couldn’t recognize either of the video cards).
Then I tried VcXsrv. It also makes Windows the X server and Linux the client. I also tried the default DIRECT setting, but this couldn’t recognize the video card, and I got GL version 3.0.

name of display: 10.80.2.22:0
display: 10.80.2.22:0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
server glx extensions:
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
client glx extensions:
GLX version: 1.4
GLX extensions:
Extended renderer info (GLX_MESA_query_renderer):
Vendor: VMware, Inc. (0xffffffff)
Device: llvmpipe (LLVM 8.0, 256 bits) (0xffffffff)
Version: 19.0.8
Accelerated: no
Video memory: 7955MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 3.3
Max compat profile version: 3.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.0
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 8.0, 256 bits)
OpenGL version string: 3.1 Mesa 19.0.8
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL extensions:
64 GLXFBConfigs:

So I then used the INDIRECT setting. With it, the video card was successfully recognized, but I got GL version 1.4…

——————————————————————————
How can I make this code run with GL 4.5 on a Linux box without a screen?

Why no display device?

Ok, why can’t it run your program?

Instead, try one of these to ensure the fastest path to the local X server:

glxinfo -display :0  
glxinfo -display :0.0

The presence of “Mesa” and “VMWare” here indicates that you are not connecting directly to the NVidia graphics drivers installed on this box (if in fact they are even installed). Are you trying to run this app in a VMWare virtual machine?

If the NVidia graphics drivers aren’t installed, I’d suggest you install them.

See this link:

From the article:

Alternatively, make sure an X server is running on the box (and NVidia graphics drivers are installed) and connect directly to the X server using a DISPLAY that supports direct mode (e.g :0 or :0.0).

Box B is a remote server and doesn’t have a screen…
The program needs OpenGL version 4.5 to run, but so far I can only get GL version 3.1 on Box B.

and when I execute this command on Box B:
$ glxinfo -display :0   (also tried :0.0)

No protocol specified
Error: unable to open display :0

and the NVidia graphics drivers are installed:

$ nvidia-smi

Mon Nov 18 19:31:50 2019
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 410.93       Driver Version: 410.93       CUDA Version: 10.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 1080    On   | 00000000:02:00.0 Off |                  N/A |
| 50%   75C    P2    60W / 180W |   6575MiB /  8119MiB |      0%      Default |

Is it impossible to have GL version > 3.2 without a screen?

Ok. That indicates that your application doesn’t have permission to access the X server running on box B, or possibly that there isn’t one running on it. You can try ps aux | grep X, just to confirm it’s the latter.

You haven’t answered my other questions:

  • Why can’t Box A (Windows) run your code?
  • Are you trying to run your app in a VMWare virtual machine on Box B?

Have you tried this link I suggested?:

This allows you to run OpenGL applications and connect to the NVidia GL driver without an X server running on the box (e.g. a system without a screen).

  • 1
~$ ps aux | grep X
AD\zpt+    438  0.0  0.0  15960   940 pts/6    S+   08:26   0:00 grep --color=auto X
root      1481 15.9  0.0 529488 56440 tty7     Ssl+ 9/29 11687:57 /usr/lib/xorg/Xorg -core :0 -seat seat0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
AD\zpt+  11607  0.2  0.0 194444 46644 ?        S    9/29  169:18 Xvnc4 :1 -desktop cq-yjy-192-168-10-17:1 (AD\zpt19) -auth /home/zpt19/.Xauthority -geometry 1800x1000 -depth 16 -rfbauth /home/zpt19/.vnc/passwd -rfbport 5901 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/onts/100dpi/,/usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
AD\zpt+  15097  0.0  0.0  22956  5576 ?        S    11/15   0:00 Xvnc4 :5 -desktop cq-yjy-192-168-10-17:5 (AD\zpt25) -auth /home/zpt25/.Xauthority -geometry 1024x768 -depth 16 -rfbauth /home/zpt25/.vnc/passwd -rfbport 5905 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/onts/100dpi/,/usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
root     18752  0.0  0.0  22956  2420 ?        S    10/22   0:00 Xvnc4 :3 -desktop cq-yjy-192-168-10-17:3 (root) -auth /root/.Xauthority -geometry 1024x768 -depth 16 -rfbwait 30000 -rfbapasswd -rfbport 5903 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/X11R6/lib/X11/fonts/100dponts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
root     19159  0.0  0.0 150844 13120 ?        S    10/22   0:14 Xvnc4 :4 -desktop cq-yjy-192-168-10-17:4 (root) -auth /root/.Xauthority -geometry 1900x1000 -depth 16 -rfbwait 30000 -rfb/passwd -rfbport 5904 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/X11R6/lib/X11/fonts/100dfonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
AD\zpt+  19841  0.0  0.0  22956  6680 ?        S    11/15   0:00 Xvnc4 :6 -desktop cq-yjy-192-168-10-17:6 (AD\zpt25) -auth /home/zpt25/.Xauthority -geometry 1024x768 -depth 16 -rfbauth /home/zpt25/.vnc/passwd -rfbport 5906 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/onts/100dpi/,/usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
AD\zpt+  30719  0.0  0.0 112228  8528 ?        S    10/14   0:01 Xvnc4 :2 -desktop cq-yjy-192-168-10-17:2 (AD\zpt19) -auth /home/zpt19/.Xauthority -geometry 1920x1080 -depth 16 -rfbauth /home/zpt19/.vnc/passwd -rfbport 5902 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usrfonts/100dpi/,/usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
AD\zpt+  31353  0.0  0.0  22956  7292 ?        S    11/18   0:00 Xvnc4 :7 -desktop cq-yjy-192-168-10-17:7 (AD\zpt25) -auth /home/zpt25/.Xauthority -geometry 1024x768 -depth 16 -rfbauth /home/zpt25/.vnc/passwd -rfbport 5907 -pn -fp /usr/X11R6/lib/X11/fonts/Type1/,/usr/X11R6/lib/X11/fonts/Speedo/,/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/,/usr/onts/100dpi/,/usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
  • 2
  • Why can’t Box A (Windows) run your code?

Because Box A has poor hardware performance, the code can only run in simple cases.

  • Are you trying to run your app in a VMWare virtual machine on Box B?

Box B is a Linux server running Ubuntu 16.04. It’s not a virtual machine, but it has no screen.

  • 3

I have tried your link, but the instructions seem too brief for me. Is it much work to change the code from GLFW3 + glad to EGL?
Anyway, when I test the EGL example code, it goes wrong.

#include <EGL/egl.h>

// configAttribs / pbufferAttribs as in the linked article's example:
static const EGLint configAttribs[] = {
  EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
  EGL_BLUE_SIZE, 8,
  EGL_GREEN_SIZE, 8,
  EGL_RED_SIZE, 8,
  EGL_DEPTH_SIZE, 8,
  EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
  EGL_NONE
};

static const EGLint pbufferAttribs[] = {
  EGL_WIDTH, 9,
  EGL_HEIGHT, 9,
  EGL_NONE,
};

int main(int argc, char *argv[])
{
  // 1. Initialize EGL
  EGLDisplay eglDpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
  // the above returns EGL_NO_DISPLAY

  EGLint major, minor;
  eglInitialize(eglDpy, &major, &minor);

  // 2. Select an appropriate configuration
  EGLint numConfigs;
  EGLConfig eglCfg;

  eglChooseConfig(eglDpy, configAttribs, &eglCfg, 1, &numConfigs);

  // 3. Create a surface
  EGLSurface eglSurf = eglCreatePbufferSurface(eglDpy, eglCfg,
                                               pbufferAttribs);

  // 4. Bind the API
  eglBindAPI(EGL_OPENGL_API);

  // 5. Create a context and make it current
  EGLContext eglCtx = eglCreateContext(eglDpy, eglCfg, EGL_NO_CONTEXT, NULL);
  // and after the above code, eglGetError() != EGL_SUCCESS
  // I think that means it can't correctly create the GL context.

  eglMakeCurrent(eglDpy, eglSurf, eglSurf, eglCtx);
  // from now on use your OpenGL context
  // 6. Terminate EGL when finished
  eglTerminate(eglDpy);
  return 0;
}
  • 4

Then I tried the multiple-GPUs example. This time the GL context creation function doesn’t produce an error.

EGLDisplay eglDpy = eglGetPlatformDisplayEXT(EGL_PLATFORM_DEVICE_EXT, eglDevs[0], 0);

but I can’t use the GL functions:
(my project is originally based on GLFW3 and glad)

gladLoadGL()

or

glfwInit()
gladLoadGL()

gladLoadGL() just returned an error, and glGetString(GL_VERSION) == NULL
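For reference, a minimal sketch of the device-enumeration path the multiple-GPUs example uses (my own reconstruction, assuming the EGL_EXT_device_enumeration and EGL_EXT_platform_device extensions are available; link with -lEGL):

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <stdio.h>

int main(void)
{
    // The EXT entry points must be fetched at runtime via eglGetProcAddress.
    PFNEGLQUERYDEVICESEXTPROC eglQueryDevicesEXT =
        (PFNEGLQUERYDEVICESEXTPROC)eglGetProcAddress("eglQueryDevicesEXT");
    PFNEGLGETPLATFORMDISPLAYEXTPROC eglGetPlatformDisplayEXT =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC)eglGetProcAddress("eglGetPlatformDisplayEXT");
    if (!eglQueryDevicesEXT || !eglGetPlatformDisplayEXT) {
        fprintf(stderr, "EGL device extensions not available\n");
        return 1;
    }

    EGLDeviceEXT eglDevs[8];
    EGLint numDevs = 0;
    eglQueryDevicesEXT(8, eglDevs, &numDevs);
    printf("found %d EGL device(s)\n", numDevs);
    if (numDevs == 0)
        return 1;  // e.g. no usable driver, or no permission on the device nodes

    // Get a display for the first device and initialize it, bypassing X entirely.
    EGLDisplay eglDpy =
        eglGetPlatformDisplayEXT(EGL_PLATFORM_DEVICE_EXT, eglDevs[0], 0);
    EGLint major, minor;
    if (eglDpy == EGL_NO_DISPLAY || !eglInitialize(eglDpy, &major, &minor)) {
        fprintf(stderr, "failed to initialize EGL display\n");
        return 1;
    }
    printf("EGL %d.%d on device 0\n", major, minor);
    eglTerminate(eglDpy);
    return 0;
}

If numDevs comes back 0 here, the failure is before any GL loading, which helps separate driver/permission problems from glad problems.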

Hello!
I have just tested the above code on my local Linux system (which has a screen). It goes well, and after

gladLoadGL() // this return true on my local ubuntu

I got GLVersion.major = 4 and GLVersion.minor=6
But this code just CANNOT work on my Ubuntu server.

eglGetPlatformDisplayEXT() // returns a valid display
gladLoadGL() // BUT this returns false

In that function,
glGetString = (PFNGLGETSTRINGPROC)load("glGetString");
I got glGetString != NULL,
BUT glGetString(GL_VERSION) == NULL

  • So must I have a screen to run this,
  • or does my server lack something?

The following is the info on my local Linux machine:

$ ps aux |grep X
root      1167  0.2  0.4 333916 65416 tty1     Sl+  11:44   0:00 /usr/lib/xorg/Xorg vt1 -displayfd 3 -auth /run/user/120/gdm/Xauthority -background none -noreset -keeptty -verbose 3
root      1456  4.5  0.5 432384 89536 tty2     Sl+  11:44   0:16 /usr/lib/xorg/Xorg vt2 -displayfd 3 -auth /run/user/1000/gdm/Xauthority -background none -noreset -keeptty -verbose 3
azure       3158  0.0  0.0  21536  1104 pts/0    S+   11:50   0:00 grep --color=auto X

I see this is much different from my server’s output above.
I’m not an expert on this; could you please tell me what’s wrong with my server and how to fix it?

Overall, the two options that come to mind:

  1. NVidia offscreen GL without X server
  2. Mesa3D’s software rasterizer.

#1 should use the local GPU (…if you can get it to work; so far, no luck).
#2 won’t use the GPU, but shouldn’t require an X server.

Now as to #1… I took some time and tried this out on a Linux box at home (with an NVidia GPU and NVidia GL drivers), partly because you were having trouble with it.

In my testing, I used EGL to create a context (OpenGL or OpenGL ES) backed with a PBuffer offscreen surface. Like you, I tried two methods for obtaining the display described on the page:

  1. eglGetDisplay( EGL_DEFAULT_DISPLAY )
  2. eglQueryDevicesEXT() + eglGetPlatformDisplayEXT()

I also tried testing the various permutations:

  • (a) - Running as the user that owns the X display (with $DISPLAY = :0)
  • (b) - Running as the same user with no $DISPLAY set.
  • (c) - Running as a “different” user with no $DISPLAY set.
  • (d) - Running as a diff user with $DISPLAY = :0 and no X cookie added
  • (e) - Running as a diff user with $DISPLAY = :0, X cookie from the display added (for access perms)

You probably only care about (c), with either method for obtaining an EGLDisplay. Unfortunately, I wasn’t able to get those to work.

Summary: The EGL method of creating an OpenGL or OpenGL ES context seems to work great so long as you have direct access to the X server. But without that, it fails. Could be I’m not doing something quite right, or there’s been a regression in this functionality.

You may want to post about your problem trying to use GL without an X server using NVidia’s drivers and instructions on one of these NVidia developer forums:

For completeness, here are the results I got:

  • 1a. Works great!

    EGL_VERSION     = 1.5
    EGL_VENDOR      = NVIDIA
    EGL_CLIENT_APIS = OpenGL_ES OpenGL
    GL_VENDOR       = NVIDIA Corporation
    GL_RENDERER     = GeForce GTX 1650/PCIe/SSE2
    GL_VERSION      = 4.6.0 NVIDIA 430.14
    GL_SHADING_LANGUAGE_VERSION = 4.60 NVIDIA
    PBuffer Size    = 320 x 320
    PBuffer Samples = 4
    
  • 1b. Works great! (same as 1a)

  • 1c. ERROR: eglInitialize() failed: err = 0x3001 (EGL_NOT_INITIALIZED)

  • 1d.

    Invalid MIT-MAGIC-COOKIE-1 keyNo protocol specified
    Invalid MIT-MAGIC-COOKIE-1 keyNo protocol specified
    Invalid MIT-MAGIC-COOKIE-1 keyNo protocol specified
    Invalid MIT-MAGIC-COOKIE-1 keyNo protocol specified
    ERROR: eglInitialize() failed: err = 0x3001  (EGL_NOT_INITIALIZED).
    
  • 1e. Works, but seems to fall back down to OpenGL 3.0 and use Mesa3D’s software rasterizer, not the NVidia GL driver.

     libEGL warning: DRI2: failed to authenticate
     EGL_VERSION     = 1.4 (DRI2)
     EGL_VENDOR      = Mesa Project
     EGL_CLIENT_APIS = OpenGL OpenGL_ES 
     GL_VENDOR       = VMware, Inc.
     GL_RENDERER     = llvmpipe (LLVM 5.0, 256 bits)
     GL_VERSION      = 3.0 Mesa 18.0.2
     GL_SHADING_LANGUAGE_VERSION = 1.30
     PBuffer Size    = 320 x 320
     PBuffer Samples = 4
    
  • 2a. Works great (like 1a)

  • 2b. Works great (like 1a)

  • 2c. Fails. eglQueryDevicesEXT() doesn’t see any devices.

  • 2d. -

  • 2e. Fails (same as 2c)

Thank you very much for spending so much time and concern.
Considering the result I got before, I think this may be a problem with the graphics card drivers or something.
In my case, EGL didn’t report anything wrong even through eglMakeCurrent(), but gladLoadGL() just wouldn’t work. I’ll try on some other computer.
Thank you.

@azuresilent, I figured out what the issue was with trying to render via an NVidia GPU without X server access, and it works great now:

Basically, add the user you want to have this access to the “video” group (in /etc/group), e.g. with usermod -a -G video <user>.

The reason is, in order for the EGL+GL init to succeed for the NVidia path, the app must be able to open kernel special files associated with the NVidia driver which are owned by group “video”. Special files such as: /dev/nvidia0, /dev/nvidiactl, /dev/nvidia-modeset, /dev/dri/card0, and /dev/dri/renderD128.

On a hunch, I took that test app I was trying to run as a user which didn’t have X server access perms and captured an strace of it while running as this user. Sifting through this, it was pretty obvious that it was failing to open some key NVidia-driver related files, and then falling back to other EGL providers via libglvnd.

Once I added that user to group video (and logged that user out and back in to pick up the group change), then that user was able to create full OpenGL 4.6 capable GL contexts without a connection to the X server, all via EGL:
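If it helps, here’s a quick sanity check for the permission issue: a minimal sketch that tests whether the current user can open the device nodes listed above (the exact paths vary per system, e.g. the card/render node numbers):

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    // Device nodes the NVidia EGL+GL init path needs to open (as listed above).
    const char *devs[] = {
        "/dev/nvidia0", "/dev/nvidiactl", "/dev/nvidia-modeset",
        "/dev/dri/card0", "/dev/dri/renderD128",
    };
    for (size_t i = 0; i < sizeof devs / sizeof devs[0]; i++) {
        // R_OK|W_OK: the driver opens these read/write
        if (access(devs[i], R_OK | W_OK) == 0)
            printf("%s: accessible\n", devs[i]);
        else
            printf("%s: NOT accessible (check group \"video\" membership)\n",
                   devs[i]);
    }
    return 0;
}

If any of these print “NOT accessible” for the user running the app, that matches the failure mode I saw in the strace.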

Created OpenGL 4.x context
EGL_VERSION     = 1.5
EGL_VENDOR      = NVIDIA
EGL_CLIENT_APIS = OpenGL_ES OpenGL
GL_VENDOR       = NVIDIA Corporation
GL_RENDERER     = GeForce GTX 1650/PCIe/SSE2
GL_VERSION      = 4.6.0 NVIDIA 430.14
GL_SHADING_LANGUAGE_VERSION = 4.60 NVIDIA
PBuffer Size    = 320 x 320
PBuffer Samples = 4

If you want some test code, just let me know. Though it’s pretty much just what’s described on:

Thank you very very much!
With your valuable help, I have solved the problem.
First, your link, EGL Eye: OpenGL Visualization without an X Server, is fundamental. It allows us to run OpenGL applications off-screen, thus without an X server or screen, and it is easy to use.

This goes well on my local PC, but when I tested the code on my server, the EGL functions all went well, yet the GL context created by EGL just couldn’t load the GL functions.
So I thought this might be due to the GPU drivers or something low-level.
So I installed Docker on my server and reinstalled the NVIDIA driver in a container (based on the nvidia-cuda image). After this, everything goes well in the container!

Again, Thank you!

Excellent! Thanks for following-up with your solution!

Hmm, on a quick look gladLoadGL() uses glXGetProcAddress() to load the GL function pointers. I don’t know a lot about EGL, but wouldn’t it have to use eglGetProcAddress() to work correctly in an X-server-less setup?
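For the glad side, glad (v1) does let you pass in your own loader instead of the default one. A sketch, assuming an EGL context has already been created and made current as in the article’s example (without a current context, loading will fail):

#include <glad/glad.h>
#include <EGL/egl.h>
#include <stdio.h>

// Load the GL entry points through eglGetProcAddress instead of GLX.
// gladLoadGLLoader() is glad v1's hook for a custom loader function.
static int load_gl_via_egl(void)
{
    if (!gladLoadGLLoader((GLADloadproc)eglGetProcAddress)) {
        fprintf(stderr, "gladLoadGLLoader failed (no current GL context?)\n");
        return 0;
    }
    printf("GL %d.%d loaded via eglGetProcAddress\n",
           GLVersion.major, GLVersion.minor);
    return 1;
}

int main(void)
{
    // ...create an EGL display/context and eglMakeCurrent() it first...
    return load_gl_via_egl() ? 0 : 1;
}

That would avoid the GLX dependency entirely, though as noted it requires the project to call the loader variant rather than plain gladLoadGL().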

This may work, but my project is based on glad and is difficult to change.
EGL is the key to an X-server-less setup; it solved my first problem. But then I found my EGL didn’t work correctly even though no errors were returned, so glad can’t be blamed. So I used Docker to create a clean environment, and the problem was solved.