The extension is supported when I check for it, but when I call
glGetIntegerv(0x87FE /* GL_NUM_PROGRAM_BINARY_FORMATS */, &numberOfBinaryFormats);

it tosses GL_INVALID_ENUM.
Do I have to make a GL 3.3 context?
I am only making a GL 2.1 context.

//EDIT : ok, I made a GL 3.3 context with WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB and I’m still getting the same [censored].

The extension is supported, so the call should be valid. So this is clearly a driver bug.

int u = 10;
glGetIntegerv(GL_PROGRAM_BINARY_FORMATS, &u);

doesn’t write a value to my u variable and fails.

glGetProgramiv(HProgramObject, GL_PROGRAM_BINARY_LENGTH, &binaryLength);
works, though.

glGetProgramBinary works also.

And as for reloading the binary :
glProgramBinary works but the link status says “unknown program binary”.

Someone on his webpage says he gets 0x8E21, so that’s what I put, and it works for me.
I guess I’ll forget about the format queries for now.
Hope nVidia fixes it.

Thank you, V-man, for drawing attention to this regression. With the first GL 4.1 beta drivers NVIDIA released, binary shaders worked perfectly.

After an initial enthusiastic effort to find out how it works, I stopped playing with binary shaders. After your post I tried to revive my (more than three months old) examples and, as you mentioned, they don’t work correctly with the latest drivers (260.99 on Vista 32-bit).

  1. glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &numBinaryFormats) leaves numBinaryFormats uninitialized (unchanged).

  2. The more serious error is generated by glGetProgramiv(m_ID, GL_LINK_STATUS, &success). It crashes the application if the version of the binary shader is out of date.

Could you try the 260.93 or 261.00 drivers? They come with a newer OpenGL 4.1 implementation from Nvidia.




Thank you for reminding me about the unofficial 261.00 drivers. I have tried them too and, as expected, they behave identically.
I’ll try 262.99; they have just arrived. So far, the first and last drivers that behaved correctly with respect to binary shaders were 259.09.

P.S. Oh, 262.99 are WHQL-certified drivers for the GTX580 only. :frowning:

You can get them to install on the 400 line by adding the following lines to the corresponding sections in the nv_disp.inf file.

%NVIDIA_DEV.06C0.01% = Section012, PCI\VEN_10DE&DEV_06C0
%NVIDIA_DEV.06C4.01% = Section012, PCI\VEN_10DE&DEV_06C4
%NVIDIA_DEV.06CD.01% = Section012, PCI\VEN_10DE&DEV_06CD

DiskID1 = "NVIDIA Windows Vista / Windows 7 (64 bit) Driver Library Installation Disk 1"
NVIDIA_DEV.06C0.01 = "NVIDIA GeForce GTX 480"
NVIDIA_DEV.06C4.01 = "NVIDIA GeForce GTX 465"
NVIDIA_DEV.06CD.01 = "NVIDIA GeForce GTX 470"

OK! Thank you!
But I really doubt they change anything besides adding support for the GTX 580.

Thanks anyway!

I’m also on Vista 32-bit and I was using 260.99 (the public release). glGetProgramiv(m_ID, GL_LINK_STATUS, &success) never crashed on me. The only important thing was the binary format, so that I could “GET” the binary shader from GL and “SET” it back.
0x8E21 is the magic number!

I am still coding. I will soon see how fast launching my app will be.

The queries for GL_NUM_PROGRAM_BINARY_FORMATS and GL_PROGRAM_BINARY_FORMATS are currently broken in the latest NVIDIA drivers. This will be fixed with new drivers coming out soon. In the meantime you can assume GL_NUM_PROGRAM_BINARY_FORMATS returns 1 and GL_PROGRAM_BINARY_FORMATS returns 0x8E21 as you correctly figured out. Sorry for this inconvenience.

Does anyone know when this will be fixed?

I’ve just installed 263.06 and everything stays the same.
Assume GL_NUM_PROGRAM_BINARY_FORMATS is 1 and keep going! :wink:

Does glGetProgramiv(m_ID, GL_LINK_STATUS, &success) crash on your configuration if you are using old binary shaders?

Do you mean binary shaders from your previous driver?

If you mean the same driver, I don’t have any crashes loading a binary and calling glGetProgramiv(m_ID, GL_LINK_STATUS, &success).

There are again newer drivers (265.90 Quadro). They again come with a newer OpenGL 4.1 implementation from Nvidia.


To install them on GeForce cards you can use the same solution as above.


No, I meant loading binary shaders saved with the first (beta) driver version that supported them. Of course it should succeed for the same driver version. But in the case of an incompatibility, linking should simply fail, and that failure is the signal to recompile the shaders. Well, glGetProgramiv crashes if those old binary shaders are linked (the old blobs are much shorter than what the current version produces), on an NV 8600GT/8600M GT/9600GT with both XP and Vista 32-bit. But it works (somehow) on a GTX470.

Never mind, I’ve just asked. I’ll try the new 265.90 drivers.

The queries for GL_NUM_PROGRAM_BINARY_FORMATS and GL_PROGRAM_BINARY_FORMATS are fixed in the 265.90 drivers.
The problem with GL_LINK_STATUS on 8xxx/9xxx cards still persists.