Thank you V-man for drawing attention to something that worked previously. With the first GL4.1 beta drivers NVIDIA released, binary shaders worked perfectly.
After an initial enthusiastic effort to find out how it works, I stopped playing with binary shaders. After your post I tried to revive my (more than three months) old examples and … as you mentioned, they don’t work correctly with the latest drivers (260.99 on Vista 32-bit).
Thank you for reminding me about unofficial 261.00 drivers. I have tried them too, and, as expected, they behave identically.
I’ll try 262.99. They just arrived. But so far, the first and last drivers that handled binary shaders correctly were 259.09.
P.S. Oh, 262.99 are WHQL-certified drivers for the GTX580 only.
I’m also on Vista 32-bit and I was using 260.99 (the publicly available ones). glGetProgramiv(m_ID, GL_LINK_STATUS, &status) never crashed on me. The only important thing was the binary format, so that I could “GET” the binary shader from GL and “SET” it back.
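The get/cache/set pattern I mean can be sketched like this. This is a minimal, hypothetical example: the file name, header layout, and function names are my own inventions, not from this thread. The blob and its format enum would come from glGetProgramBinary() at shutdown, and both would be handed back to glProgramBinary() on the next launch:

```c
#include <stdio.h>
#include <stdlib.h>

/* Save a program binary to disk: format enum first, then length,
   then the raw blob (as returned by glGetProgramBinary). */
int save_program_binary(const char *path, unsigned int format,
                        const void *blob, size_t length)
{
    FILE *f = fopen(path, "wb");
    if (!f) return 0;
    int ok = fwrite(&format, sizeof format, 1, f) == 1 &&
             fwrite(&length, sizeof length, 1, f) == 1 &&
             fwrite(blob, 1, length, f) == length;
    fclose(f);
    return ok;
}

/* Load it back: returns a malloc'd blob (caller frees) plus the
   format to pass to glProgramBinary, or NULL on any read failure. */
void *load_program_binary(const char *path, unsigned int *format,
                          size_t *length)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    void *blob = NULL;
    if (fread(format, sizeof *format, 1, f) == 1 &&
        fread(length, sizeof *length, 1, f) == 1) {
        blob = malloc(*length);
        if (blob && fread(blob, 1, *length, f) != *length) {
            free(blob);
            blob = NULL;
        }
    }
    fclose(f);
    return blob;
}
```

On load, if glProgramBinary then reports a link failure, the cached file is stale and the shaders get recompiled from source.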
0x8E21 is the magic number!
I am still coding. I’ll soon see how fast my app launches.
The queries for GL_NUM_PROGRAM_BINARY_FORMATS and GL_PROGRAM_BINARY_FORMATS are currently broken in the latest NVIDIA drivers. This will be fixed with new drivers coming out soon. In the meantime you can assume GL_NUM_PROGRAM_BINARY_FORMATS returns 1 and GL_PROGRAM_BINARY_FORMATS returns 0x8E21 as you correctly figured out. Sorry for this inconvenience.
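Until fixed drivers ship, that workaround can be wrapped in a tiny helper. A hedged sketch (the helper name is mine; queried_count and queried_format stand in for the values glGetIntegerv returns for GL_NUM_PROGRAM_BINARY_FORMATS and GL_PROGRAM_BINARY_FORMATS):

```c
/* Sketch of the workaround described above: trust the driver's answer
   when it looks sane, otherwise assume a single supported format,
   NVIDIA's 0x8E21. The parameters model glGetIntegerv results. */
unsigned int pick_binary_format(int queried_count, unsigned int queried_format)
{
    if (queried_count >= 1 && queried_format != 0)
        return queried_format;  /* drivers report the format correctly */
    return 0x8E21u;             /* broken 260.xx drivers: use the known value */
}
```

Once repaired drivers arrive, the fallback branch simply stops being taken.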
No, I meant loading binary shaders saved with the first (beta) drivers that supported them. Of course loading should succeed with the same driver version, but in the case of an incompatibility, linking should simply fail, and that failure is the signal to recompile the shaders. Instead, glGetProgramiv crashes when I try to link the old binary shaders (a much shorter binary than the current drivers produce), on NV 8600GT/8600M GT/9600GT under both XP and Vista 32-bit. But it works (somehow) on a GTX470.
Never mind, I’ve just asked. I’ll try the new 265.90 drivers.