One important thing to start with is an installable OpenGL.dll that ships either with an SDK or with the ICD driver, because, as everyone knows, OpenGL32.DLL has aged badly. The same as we have on Linux!
Okay, now on the agenda, for the glory of GL 3.5:
The deprecated parts of GL should be implemented by the ARB, either on top of D3D or on top of the current core spec. THIS IS A MUST to relieve IHVs and let them focus on implementing the core spec.
Create an OpenGL.dll that incorporates the above plus the core spec functionality, without having to use extensions to get core functions, for goodness' sake.
What’s wrong with the current method of extensions?
This does not seem to be within the ARB's power. I bet they can do nothing about it.
Nothing wrong with extensions
I’m talking about something like libGL.so, which exposes the core functionality of the spec version without the need to use extensions. This is how it works under Linux.
The ARB or the platform vendor could provide this GL library, since it’s part of system integration.
Otherwise, let the IHVs provide one for Windows, as they already provide one for Linux.
The more I think about this the less feasible it seems.
Isn’t it easier to layer such a library over the existing opengl32.dll? Something like GLFW+GLEW (without deprecated features and obsolete extensions), supplied with an application rather than a driver. Why change something that has been working for over 10 years?
Ok, a couple of things.
The method Linux/Solaris uses is even more flawed than Windows’. With just a single libGL.so file, you cannot have two cards that require different implementations of it, for example if you have an NVIDIA card and an ATI card. If you just want an updated version of OpenGL32.dll, talk to Microsoft, as they control it.
I actually looked into replacing it, but it seems that the drivers call back into OpenGL32.dll for some reason. I have code that will find the correct driver for the current screen, but the initialisation always fails.
It is legal to implement drivers in such a way that calling GetProcAddress(“glFoo”) in two different contexts can return two different (non-NULL) addresses.
Check http://oss.sgi.com/projects/ogl-sample/GLsdk.zip to see what kind of hoops you’d have to jump through, if you cared to have 100% legal extension handling in your code.
Yep, the docs on wglGetProcAddress only guarantee that the extension function addresses are unique per pixel format.
Can’t say as I’ve ever run into a problem with this, but regardless of how this pans out in practice it’s not too difficult to hoist extensions into a per-context container of some sort and be done with it.
- A dummy context for OpenGL 3.
- The extension mechanism should be used for extensions, but not for core features. Core features should be initialised the way *nix does it.
P.S. Sorry for the bad English…
What’s wrong with the current method of extensions that we can actually do something about?
Nobody likes having to create a dummy context, but that’s just tough cookies.
Extensions are great! Nothing wrong with that.
However, having to load the core API through extensions is wrong!!!
Why? Because people will still think it’s an extension, not core.
I appreciate the Linux (NVIDIA) implementation, where the libGL.so exposes the full core functionality.
OpenGL32.dll is simply outdated… it’s stale and rusty.
Please read the OpenGL Linux ABI. Or search the forum. There is a reason why statically linking to the full core functionality is not a good idea, and anyone who actually tried shipping an application to actual customers with unknown hardware specs should know why.
I think this misinformation is posted about 2 times a week here, I’m tired of correcting it…
I still cannot see where the prob is. If the HW supports GL version x.y, then why is it not a good idea to expose the core functionality without having to resort to extensions? Can you give me some examples, please?
I still cannot see where the prob is.
Neither can we.
Frankly anything other than the way it currently is would stymie my work flow.
I still cannot see where the prob is.
That you cannot see a problem does not mean that the problem does not exist. Any professional developer can easily handle loading function pointers manually. Any amateur developer can easily handle using GLEW or GLee.
Loading function pointers is the absolute last thing the ARB should be concerned about.
Yes sir, it’s pretty trivial. However, my question was not a complaint about the burden of dynamically linking to API calls.
The current OpenGL32.lib is not capable of exposing many features that the HW may support. Is it easy for IHVs to add certain window-creation flags??? I dunno myself, just asking. And I thought this DLL has been outdated since the 3dfx days… it needs a good overhaul. And I’m not asking the ARB in particular to implement it, but rather pushing the platform vendors to cooperate… maybe eliminating the need for it, and making it IHV-supplied, like on Linux.
Now someone will say the Linux GL lib has problems… etc. Okay then, answer my curiosity and tell me why it’s wrong, the way it’s done on Linux?
Don’t mind Glfreak… he’s our resident patron saint of lost causes and social engineering.