OpenGL 3.2 support in new NVIDIA Linux beta driver

The website Phoronix claims that the new NVIDIA beta drivers for Linux support the unreleased OpenGL 3.2 standard. NVIDIA's changelog doesn't say anything about the new spec, but one of the screenshots on Phoronix does indeed show the NVIDIA control panel reporting support for version 3.2 of OpenGL.
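
For what it's worth, the control panel presumably just reports whatever the driver's GL_VERSION string says. A toy parser sketching that check (the sample string below is made up, not copied from the beta driver):

```python
def parse_gl_version(version_string):
    """Extract (major, minor) from a GL_VERSION string.

    The GL spec guarantees the string begins with "<major>.<minor>",
    optionally followed by ".<release>" and vendor-specific text.
    """
    head = version_string.split()[0]      # e.g. "3.2.0"
    major, minor = head.split(".")[:2]
    return int(major), int(minor)

# Hypothetical string in the style NVIDIA drivers use:
print(parse_gl_version("3.2.0 NVIDIA 190.16"))  # -> (3, 2)
```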

Read it here:

Any thoughts? I was indeed expecting something for SIGGRAPH, as is mentioned in the article, but I didn't expect to hear about OpenGL 3.2 for the first time like this. According to the article, no new hardware is required for the new spec (which makes sense, because OpenGL 3.0/3.1 already require DirectX 10 class hardware… and besides AMD's DirectX 10.1 hardware, there is nothing newer than that on the market yet).

I wonder which of the extensions will become requirements for OpenGL 3.2… perhaps they'll finally include geometry shaders, since AMD has taken its first steps toward supporting them in its drivers…

Well, it’s not like NVIDIA doesn’t sit on the ARB. They know what’s coming down the pipe. And they’re more proactive than ATI about OpenGL, so they’re probably planning on GL 3.2 support soon after the spec hits.

Check out the gl.spec file from the registry. It already mentions OpenGL 3.2 and lists newly deprecated functions (including stuff introduced in 3.0, IIRC).

No new entry points at this time, but I’d expect new stuff at some point after September.
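
If you want to skim the registry file for those entries yourself, a toy scanner gives the idea. The two-entry SAMPLE_SPEC below is illustrative only, a simplified imitation of gl.spec's layout (one unindented function line, indented property lines under it), not the real file:

```python
SAMPLE_SPEC = """\
ColorPointer(size, type, stride, pointer)
	return		void
	deprecated	3.1

DrawArrays(mode, first, count)
	return		void
"""

def deprecated_functions(spec_text):
    """Yield (function, version) pairs for entries with a 'deprecated' property."""
    current = None
    for line in spec_text.splitlines():
        if line and not line[0].isspace():
            # Unindented line starts a new function entry.
            current = line.split("(")[0]
        elif current and line.split()[:1] == ["deprecated"]:
            yield current, line.split()[1]

print(list(deprecated_functions(SAMPLE_SPEC)))  # -> [('ColorPointer', '3.1')]
```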

… or maybe some announcement in two weeks at the OpenGL BOF at SIGGRAPH :slight_smile:

Let’s hope they nail the lid down on SM4/4.1, as SM5 is just around the corner.

(Course I’m still stumbling around in 2.x land so what do I care. ;-))

Expecting something new and powerful.

Great news, though already deprecating functions from 3.0 is a bit annoying.

I currently have the official OpenGL Programming Guide for 3.0 and 3.1 on pre-order on Amazon, for release at the end of August. Are they going to hold off printing until the 3.2 spec is out? I’m going to be a bit miffed if I’ve spent £35 on a new guide that covers features that are already deprecated and doesn’t include the features of the current spec.

It would be nice to have some kind of Official response on this.

So, OpenGL should just stop getting new features and extensions until a book gets printed?

Three versions of GL within the same year have never happened before. If GL 3.2 makes a few books out of date, so be it.

So what extensions do you realistically expect to move into the core with OpenGL 3.2?

Personally, I don’t see the bindless graphics extensions moving into core. It would be great if they did, I guess, but it’s too drastic a move to happen soon. I think they might move the geometry shader into core, which would be a logical step. On the other hand, isn’t the geometry shader being superseded on newer DirectX 11 class hardware? Or does it still have a reason to exist alongside DirectX 11 capabilities?

What about creating binary blobs for GLSL shaders so they can be precompiled? I think this will enter OpenGL/GLSL at some point, but could it happen with OpenGL 3.2? Another thing I can think of is lifting the restriction of tight coupling between vertex shaders and fragment shaders. Separate vertex and fragment programs would allow more flexibility in combining programs (without having to link all possible permutations of combined programs).
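
The permutation problem is easy to quantify: with monolithic programs, every vertex/fragment pairing needs its own link, whereas hypothetical separate stage objects would only grow additively. A quick back-of-the-envelope sketch:

```python
def monolithic_links(num_vertex, num_fragment):
    # One fully linked program per (vertex shader, fragment shader) pair.
    return num_vertex * num_fragment

def separate_objects(num_vertex, num_fragment):
    # Each stage compiled once, mixed and matched at bind time.
    return num_vertex + num_fragment

for v, f in [(4, 4), (10, 20)]:
    print(v, f, monolithic_links(v, f), separate_objects(v, f))
# 4 vertex and 4 fragment shaders: 16 linked programs vs. 8 stage objects;
# at 10 x 20 it is already 200 vs. 30.
```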

One last thing I’d like to see (but which I’m not expecting, tbh) is texture samplers that aren’t bound to a texture. That way you wouldn’t need multiple textures that are essentially the same but only use different sampler settings.

At this point, these are all the major changes I can think of for OpenGL 3.2. But (except for bindless graphics) we haven’t seen any of these things as extensions yet, correct? And NVIDIA’s beta driver doesn’t expose any new extensions covering these points either, I believe. It could be that there won’t be any extensions preceding the new OpenGL spec and that all changes will just be made to the core. It could also be that there are just some changes to GLSL plus some minor OpenGL changes, such as certain texture formats moving into core.

Unfortunately I don’t have NVIDIA hardware capable of OpenGL 3.*, so I can’t play with the new beta driver.

I would be surprised if anisotropic filtering doesn’t make it into core in 3.2. This extension has been supported by everyone for over a decade, and it’s too damn useful.

It’s too early to tell what else will be included, but we might see shader binaries (they’re supported in OpenGL ES, so yeah). Other potential features are sampler states (I hope!), some form of tessellation (EXT probably, rather than core), and maybe, just maybe, some improved support for threading (either via display lists or some other mechanism).

I would really love to see DSA too, but I doubt that’s possible in such a short timeframe.

AMD has had a tessellation extension available for quite a while now (unfortunately they lag about six months in implementing new OpenGL APIs; still no OpenGL 3.1 support…). Does NVIDIA hardware even support tessellation? That’s part of DirectX 10.1, right? Which isn’t supported by NVIDIA (at least not completely, but complete support is coming in September, I believe, with new hardware from NVIDIA).

Some form of threading would be great as well, and is necessary to compete with DirectX 11, I’d say (though I’m not sure OpenGL should aim at competing with DirectX, as it’s used in a broader range of applications).

I would definitely expect GL_EXT_separate_shader_objects (mentioned here).

Tessellation is part of DirectX 11. No chance of this making it into OpenGL 3.2.

I think we could expect GL_EXT_provoking_vertex, GL_EXT_texture_snorm, GL_EXT_vertex_array_bgra, GL_EXT_texture_swizzle, and maybe something related to GL_NV_explicit_multisample. On top of that, the big new feature I expect is shader binaries.

GL_NV_copy_image promoted to ARB; I expect this extension to be like GL_ARB_copy_buffer, but for images, for better interoperability with OpenCL. I definitely expect NVIDIA’s OpenCL implementation to be released to the public.

Other possible big things: WGL_AMD_gpu_association and WGL_NV_gpu_affinity as an ARB extension… if they manage to agree!

I think the geometry shader will stay as it is. I’m hoping for (but don’t really believe in) an updated DSA extension, though not in core.

Different blend modes per render target might also appear (GL_AMD_draw_buffers_blend), as was suggested here:…5720#Post255720

But this isn’t supported on NVIDIA cards yet…

According to the topic Eosie pointed to, it should be (unless it’s a DirectX 10.1 feature; in that case you’re right, but the topic mentions just DirectX 10).

It’s a Direct3D 10.1 feature; that’s why I said it’s not supported on NVIDIA hardware yet.

NVIDIA has some new OEM hardware that supports 10.1 (the G210 & GT210). Aside from those, note that the Direct3D 10.1 feature level is all or nothing, so it’s quite possible for a chip that doesn’t support Direct3D 10.1 as a whole to still support some of its features.

As far as I know, existing NVIDIA DirectX 10 hardware (other than the G210 and GT210) supports some of the features that are part of DirectX 10.1, but not all of them. I don’t know which features are supported, though.

I bet it’d be access to individual MSAA/CSAA samples. But that’s something all NVIDIA cards since the GeForce 8x00 series have had.