OpenGL rebuilt from the ground up?

Hi,

I have been using OpenGL for a year now for research purposes, and I have just been tinkering with DirectX because of its DirectCompute functionality.

What amazes me is the organization of the DirectX 10/11 API.

Would it be possible for OpenGL to be rebuilt from scratch in a more structured and intuitive way, similar to DX10/11?

From the impression I got of OpenGL’s extensions, they look really tacked on. It would most likely be more efficient and easier to learn if OpenGL were rebuilt from scratch.

Thank you for your time

Humm, a first post like this on April 1st makes me suspicious…

But anyway: this has been proposed a few times, around OpenGL 2.0 and again around OpenGL 3.0, but it does not look like it will happen in the near future. If you want DirectCompute-style functionality, you will need to use OpenCL.

The best you can do is just use a pure OpenGL 3.x/4.0 core profile without any of the legacy functionality.

And when you do, the similarities to D3D10/11 are quite large. Once DSA finds its way into core OpenGL, it will once again be a very simple and very powerful API, at least for me.
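For anyone who hasn’t tried it, here’s a minimal sketch of the difference DSA makes, using the existing EXT_direct_state_access entry points (tex is just a placeholder for a texture object name):

```cpp
// Classic bind-to-edit: every edit goes through a global binding point,
// clobbering whatever happened to be bound there before.
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// EXT_direct_state_access: the object is named explicitly, and the global
// binding state is left untouched.
glTextureParameteriEXT(tex, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTextureParameteriEXT(tex, GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```

No save/restore dance around the binding point, which is exactly the kind of thing D3D10/11 gets right.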

And anyway, the wheel of reincarnation has almost returned to software rendering. It’ll be less than a decade before stream processing and/or many-core processing is generic and ubiquitous enough for libraries like OpenGL and Direct3D to be real libraries, rather than interfaces to drivers. When that happens, you’ll be able to write your own graphics library and structure it any way you please.

Since rebuilding OpenGL from the ground up would take several years, and OpenGL 4 + DSA is Good Enough for the foreseeable future, it makes more sense for the ARB to focus on short-term gains.

It would most likely be more efficient and easier to learn if OpenGL were rebuilt from scratch.

Well, considering that OpenGL is already more efficient than D3D (and even more so if you delve into NVIDIA-specific extensions), I have to say that this claim is somewhat dubious.
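In case anyone is curious, the NVIDIA-specific extensions I have in mind are the bindless graphics ones (NV_shader_buffer_load and NV_vertex_buffer_unified_memory), which cut driver overhead by sourcing vertex data from raw GPU addresses. A rough sketch, assuming vbo already contains vertex data; Vertex and bufferSize are placeholders of mine:

```cpp
// Make the buffer resident so the GPU can dereference it directly, then
// query its GPU address (NV_shader_buffer_load).
GLuint64EXT addr = 0;
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glMakeBufferResidentNV(GL_ARRAY_BUFFER, GL_READ_ONLY);
glGetBufferParameterui64vNV(GL_ARRAY_BUFFER, GL_BUFFER_GPU_ADDRESS_NV, &addr);

// Feed attribute 0 straight from that address, skipping the per-draw
// buffer-object validation in the driver (NV_vertex_buffer_unified_memory).
glEnableClientState(GL_VERTEX_ATTRIB_ARRAY_UNIFIED_NV);
glVertexAttribFormatNV(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex));
glBufferAddressRangeNV(GL_VERTEX_ATTRIB_ARRAY_ADDRESS_NV, 0, addr, bufferSize);
```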

The problem with OpenGL is the number of newbie traps that choke off performance. If you know what you’re doing, you can make it work just fine. If you don’t, then you won’t.
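To give one concrete example of such a trap (a sketch; the variable names are mine): a naive glReadPixels stalls the CPU until the GPU has drained its entire command queue, while the same call through a pixel buffer object is queued like any other command:

```cpp
// Trap: blocks until every queued GL command has finished executing.
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, pixels);

// Escape: with a pixel pack buffer bound, the read returns immediately;
// map the buffer later, once the data is actually ready.
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, 0); // offset into the PBO
// ... do a frame's worth of other work ...
void* data = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
// ... use data ...
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
```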

Aside: outside of the specific need for DSA for actual functionality (multithreading, etc.), why is “OpenGL 4 + DSA is Good Enough”? DSA doesn’t make the API more modern or anything; it just makes it different, apart from the aforementioned actual functionality. Notably, that functionality requires more than just the DSA extension; you can’t have multithreading without an extension that specifies the behavior of a multithreaded GL context.

And really, there’s nothing you want from your rendering API “for the foreseeable future”? No blend shaders, no REYES, nothing? I mean, not even separation of shaders? What, is everyone using ubershaders or some nonsense now? With 5 independent shader stages now, I don’t see how anyone can be satisfied with the potential combinatorial explosion of program objects.
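To make the combinatorial explosion concrete: with monolithic program objects, N vertex shaders and M fragment shaders force you to link N×M programs. What I want is something like this sketch, where each stage is its own program and a pipeline object mixes them at bind time (the entry points follow the separate-shader-objects proposals, so treat them as illustrative rather than final):

```cpp
// One linked program per stage instead of one per combination.
GLuint vs  = glCreateShaderProgramv(GL_VERTEX_SHADER,   1, &vsSource);
GLuint fsA = glCreateShaderProgramv(GL_FRAGMENT_SHADER, 1, &fsSourceA);
GLuint fsB = glCreateShaderProgramv(GL_FRAGMENT_SHADER, 1, &fsSourceB);

GLuint pipe;
glGenProgramPipelines(1, &pipe);
glBindProgramPipeline(pipe);

// Swap just the fragment stage: the vertex program is linked once and
// reused, so the total grows as N+M single-stage programs rather than
// N*M monolithic ones.
glUseProgramStages(pipe, GL_VERTEX_SHADER_BIT,   vs);
glUseProgramStages(pipe, GL_FRAGMENT_SHADER_BIT, fsA); // variant A
// ...later...
glUseProgramStages(pipe, GL_FRAGMENT_SHADER_BIT, fsB); // variant B
```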

Sorry for not articulating myself very well. When I said “OpenGL 4 + DSA is Good Enough”, I was referring to the organization of the framework, arguing that OpenGL doesn’t need to be reorganized, as the OP suggested. The extra features you mentioned (and I can think of more; blobs would be nice) are the kind of short-term gains that I think the ARB should focus on.
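On blobs, to spell out what I mean: let the app pull the driver-compiled program back out as an opaque binary and reload it on the next run, skipping recompilation. A sketch of what such an API could look like (the names mirror the get-program-binary idea and are illustrative):

```cpp
// Save: after linking once, retrieve the driver's compiled blob.
GLint len = 0;
glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &len);
std::vector<char> blob(len);
GLenum format = 0;
glGetProgramBinary(prog, len, NULL, &format, blob.data());
// ... write 'format' and 'blob' to disk ...

// Load: on the next run, feed the blob back and skip compilation.
glProgramBinary(prog, format, blob.data(), (GLsizei)blob.size());
```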

As far as DSA, it makes the library easier to work with, so from an organizational perspective, it’s a big win.

Actually, considering the OP was talking about the organization of DirectX and how the OpenGL extensions feel “tacked on”, I don’t think the OP meant “efficient” in the same way you did. OP was probably talking about programming efficiency, not machine efficiency.

Best April First EVER!

Oh Really? I’d say MEGA LAME!!

I don’t get it. Why does he have AA batteries in his car battery?
I once opened a few car batteries.

I’m for DSA, or anything that would make driver writing easier, so that pink/black textures no longer show up on some video cards :wink:

Remember that one of the reasons to revolutionize the API is to simplify implementation, and hence get more stable and reliable drivers.

Then you have been fooled.
http://www.joelonsoftware.com/articles/fog0000000069.html

I see what you mean…nice article.

I believe that if OpenGL just trashed the old (legacy) calls and built entirely on the new streamlined calls, it would lose its market share, which is largely confined to the CAD software arena. That is because those high-end programs have large OpenGL code bases written against the legacy calls.
If those developers had to rewrite their rendering pipelines from scratch, they would definitely choose a different API, for reliability reasons.

I call this an epic fail!

That’s why I believe in “evolution”, and as far as OpenGL goes, I would say it has been a FAST evolution since the OpenGL 3.0 release.

The profile and extension system is good for evolution; in any case, we can say “I’m writing a program for this version,” accepting its advantages and drawbacks.

The current version probably won’t be the last version of OpenGL, so you can think ahead about how your software could evolve and design it accordingly.

I was really into OpenGL Longs Peak, but I’m not anymore, as the ARB has proved it can handle much more work than it used to. We will see how OpenGL continues to evolve, but I am definitely confident.

I can easily imagine how revolutionizing the API could lead to drivers that are easier to maintain. I mean, who hasn’t refactored code to make it easier to maintain?

But in this case, we have to consider legacy apps. IHVs can’t just stop supporting current-generation APIs. So even if next-gen drivers were easier to maintain, IHVs wouldn’t see the benefit for at least five years, and during that time they would have effectively doubled their workload, because they would have to maintain two different drivers.

I was really disappointed when Longs Peak failed, but in retrospect, it shouldn’t have been a surprise that it failed. (Even if it did look really elegant.)

You know, I’ve read that article before, and it has a few sticking points with me (IMO, code does “rust” if the same code could be written in a better way for current compilers/libraries/hardware), but I do agree with the basic sentiment: while it’s okay to rewrite portions of your code, it’s usually a mistake to rewrite an entire program when it comes to large-scale apps.

Well, I’ve wrapped the whole API in my own API.
It maps well onto ‘other’ APIs too.
Frankly, it makes no difference to me what happens with GL now. I rarely touch it, except through surgical gloves (i.e. my wrapper interface).

Nice article, and I agree with most of it, though I believe code does ‘rust’. Your own code may be fine, but the operating systems and hardware that surround your software change. Large portions of your software may simply become out of date. Worse, the core functions that every other class and function relies on may depend on now-defunct technology. Apple rewrote their OS from scratch and it did just fine. If anything, it knocked everyone’s socks off…

Apple rewrote their OS from scratch and it did just fine.

No, they didn’t. They rewrote parts of it, took other parts from NeXT (which already existed and were well-tested), and ported still other parts over from their current OS.

As for “it did just fine”: maybe after a couple of revisions it did. But Mac OS X 10.0 was… let’s just say, not a quality release.

That being said, 10.0 was vital, which is the point I think Joel missed. Sledgehammers exist for a reason. It’s overkill to take one to a nail, and it’s likely to cause more problems than it solves. But if you need to put a hole in a wall, it’s the right tool for the job.

Same thing here: Mac OS pre-X was ancient and decaying; it didn’t even have proper memory protection for regular memory allocations (to get protected memory, you had to allocate it through a special API). Upgrading it in place to a modern OS would have taken just as much work as taking a sledgehammer to it, and the resulting code would likely not have been as clean, modular, or functional as the freshly written code.

Sure, the initial releases had some issues. But things are so much better now than they would have been without the sledgehammer that it’s hard to say that it wasn’t the right idea. Short-term thinking isn’t always a good idea.