One more hit on OpenGL?

Just noticed this on Slashdot, and I quote: “XNA isn’t a rehash of DirectX tools for the Xbox2, PC and WinCE devices after all, it’s a full-on assault on the gaming world, with the prize being complete dominance of the market”. Based on an article posted here: http://www.eurogamer.net/article.php?article_id=55585

So, another hit on GL? I’m starting to think that if it weren’t for iD and Carmack, GL’s “gaming position” would be dead already.

What do you guys think?
Would any ATI and NVIDIA guys like to post some thoughts?

What I can say is that vendors other than the big three don’t have sufficient OpenGL support for a modern game. (Big 3 == ATI, Intel and NVIDIA.)

I can also say that, even among the big three, our three-month-old DirectX back-end has fewer compatibility problems than our four-year-strong GL back-end.

If Linux or MacOS were to fall off our radar, we’d be unlikely to have long-term support for OpenGL in our business plan.

Originally posted by KRONOS:
So, another hit on GL? I’m starting to think that if it weren’t for iD and Carmack, GL’s “gaming position” would be dead already.
With Linux picking up momentum, it’s hard to see OpenGL fading away, as it’s the only option on that platform. OpenGL ES should also breathe more life into OpenGL development, together with the possibility of accessing it from Java. After all, if you think about it, there’s no way OpenGL will go away soon: there are a lot of markets and segments in which DirectX is not available, and in many cases never will be.

It would be fun if they put OpenGL on the PS3. Unlikely, but a common complaint is that the PS2 is difficult to write for. The Xbox dev tools are phenomenal, especially the graphics profiling tool. XNA will be a great benefit to most Xbox and now PC developers; too bad there is no equivalent for OpenGL. :frowning:

Heh, the PS2 has no drivers at all; there’s nothing. Just packets, DMAs and microcode. PS3… I dunno what it’ll have, probably not OpenGL direct from Sony. Oh well, it’s amazing people do things this way; just think of the resources it has cost worldwide, with everyone implementing their own VU microcode. Still, some of the things developers manage to do on the PS2 make your head spin.

If you actually watch the interview mentioned in the first post, you’ll find it has pretty much nothing to do with XNA. It’s mostly about Microsoft’s branding and content-delivery strategies, not about the API.

It was stated in the announcements for the PSP that it uses OpenGL for its API.

There are a lot of pretty popular games that use OpenGL. Games from BioWare such as the Baldur’s Gate series use some OpenGL, NWN and the UT games have OpenGL rendering, Diablo 2, Warcraft 3 and World of Warcraft have OpenGL renderers, and all games based on the id engines, of which there are a LOT, use OpenGL. There are countless others.
I think it would be pretty cool if the PS3 had support for OpenGL; we’ll have to wait and see.

What part of OpenGL was hit by this? I’m very concerned, because if they hit an important call like glBegin() we might be screwed. Can someone check the headers and symbol tables in case they hit something important? XNA seems like it might be quite a large piece of software, so just a single hit might take out several functions.

Nah, only edge flags and color-index arrays were taken out. No one will even notice.

I really wish MICROS~1.OFT had done more to help the PC game industry and finally, for God’s sake, done something about the horrible DirectX-specific input lag problem, which completely destroys playability in games played on low-end cards.

Black Hawk Down, Tron 2.0 and Prince of Persia SoT suck miserably because of the input lag, even though the framerate is modest but acceptable.

If you’re talking about how DirectX may queue more than one frame ahead in rendering, you can easily get around that with various double-buffered framebuffer or texture-target lock schemes. But the specifics of DirectX should probably be left to a DirectX forum. It’s interesting that gl-gamedev is pretty much dead, as far as I know, whereas DIRECTXDEV has a healthy, active community of commercial game developers. Don’t see too many of those around here, either…
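For the curious, here’s a minimal sketch of one way to cap the driver’s queue, using a D3D9-style event query rather than a lock scheme. The device and frame loop are assumed; this is illustrative, not a drop-in fix:

```cpp
#include <d3d9.h>

// Minimal sketch, assuming an initialized IDirect3DDevice9* device.
// Issuing an event query after Present() and spinning until it is
// signalled forces the driver to drain its command buffer, capping
// how many frames it can queue ahead (and hence the input lag).
void presentWithoutQueuing(IDirect3DDevice9* device)
{
    IDirect3DQuery9* frameDone = NULL;
    if (FAILED(device->CreateQuery(D3DQUERYTYPE_EVENT, &frameDone)))
        return;  // queries unsupported; fall back to a normal Present

    device->Present(NULL, NULL, NULL, NULL);

    frameDone->Issue(D3DISSUE_END);
    while (frameDone->GetData(NULL, 0, D3DGETDATA_FLUSH) == S_FALSE)
        ;  // busy-wait until the GPU has finished this frame

    frameDone->Release();
}
```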

As renderers for games begin to outlive a single game, I could see OpenGL picking up again in gamedev if a major console supports it. Though, to really tip the balance, there would need to be at least three platforms on which the renderer could run without many changes.

So if the PC and two next-gen platforms run OpenGL, and if the sum of their markets is bigger than the Xbox’s, then OpenGL becomes an interesting option.

But I never really understood why Microsoft wanted its own 3D API anyway. When they introduced Direct3D, they pretty much had control of the desktop computer market, so the target platform for games was obviously Win9x. Why did they care what 3D API was being used? I can’t really believe they were shaking in fear that people would move to another platform just because you could easily recompile an application on it.

But I never really understood why Microsoft wanted its own 3D API anyway. When they introduced Direct3D, they pretty much had control of the desktop computer market, so the target platform for games was obviously Win9x. Why did they care what 3D API was being used? I can’t really believe they were shaking in fear that people would move to another platform just because you could easily recompile an application on it.
It’s a question of control.

D3D isn’t the preferred API because Microsoft didn’t adequately support OpenGL. It is this way because of Direct3D 8.

This was the first API release that could be considered both good and superior to OpenGL (version 7 was nice, but it wasn’t as solid as 8). No longer could GL users point at D3D and say that it was categorically worse. More than that, however, it had shaders out of the box, ready to use, and cross-vendor (even though nVidia supported them from the beginning). It had decent performance with its vertex buffers, and it had the advanced features to stay useful into the future.

OpenGL relied on nVidia’s extensions. That answer worked right up until the Radeon 8500 came out: it had programmability, but it used its own set of instructions. It took another year after the 8500, on top of the nine months or so between the GeForce 3 and the 8500, before we saw ARB_vertex_program. We still don’t have a cross-vendor extension for using NV20/R200-level hardware fragment programs.
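For readers who haven’t used it, this is roughly what the eventual cross-vendor interface looks like; a minimal ARB_vertex_program sketch (it assumes the extension entry points are already resolved via glext.h or your extension loader) that just transforms position and passes color through:

```cpp
#include <cstring>
// Assumes <GL/gl.h> plus ARB_vertex_program entry points resolved via
// wglGetProcAddress/glXGetProcAddress or an extension loader.

// The canonical pass-through vertex program for ARB_vertex_program.
static const char* kVertexProgram =
    "!!ARBvp1.0\n"
    "ATTRIB pos    = vertex.position;\n"
    "PARAM  mvp[4] = { state.matrix.mvp };\n"
    "DP4 result.position.x, mvp[0], pos;\n"
    "DP4 result.position.y, mvp[1], pos;\n"
    "DP4 result.position.z, mvp[2], pos;\n"
    "DP4 result.position.w, mvp[3], pos;\n"
    "MOV result.color, vertex.color;\n"
    "END";

void bindPassThroughProgram()
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(kVertexProgram), kVertexProgram);
    glEnable(GL_VERTEX_PROGRAM_ARB);
}
```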

Why did this happen? Because the ARB is a committee, and Microsoft is one entity. As a single entity, they can make decisions rapidly. The ARB meets four times a year to decide on ARB extensions; the Microsoft Direct3D people can walk down the hall and talk to each other. Sure, Microsoft listens to hardware vendors and takes their needs seriously, but a single entity is ultimately responsible for the end result.

This has pros and cons. The obvious pro is speed-to-implementation: D3D can evolve rapidly. The obvious con is that, if the one entity gets it wrong, it’s bad for everyone who has to use it (see DirectX 3, 5, and 6).

Creating D3D gave Microsoft the speed-to-implementation that puts them in the best position to help game developers make games for new hardware. OpenGL simply doesn’t move fast enough.

Quite frankly, I wish either someone would step forward to become the overlord of OpenGL, or a small group of IHVs on the ARB would form a cabal. In either case, the purpose of this individual/group would be to create quick EXT extensions to add appropriate functionality in a platform-neutral way. Look how long we suffered without VBO; that’s just ridiculous.

EXT_render_target is exactly the kind of thing I’m talking about. No ARB involvement; just a few IHVs getting together and pumping out a highly useful extension. The ARB can decide on its own time what to do with it.
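To make the gap concrete, here’s a sketch of the portable status quo such an extension would replace: render the off-screen pass into the back buffer and copy the pixels out with plain core GL. (drawOffscreenPass, colorTex and the sizes are hypothetical app-side names.)

```cpp
void drawOffscreenPass();  // hypothetical: draws the off-screen scene

// Render-to-texture today, without EXT_render_target or pbuffers:
// draw into the back buffer, then copy the pixels into a texture.
// The copy is exactly what a real render-target extension eliminates.
void renderToTextureFallback(GLuint colorTex, int texWidth, int texHeight)
{
    glViewport(0, 0, texWidth, texHeight);
    drawOffscreenPass();

    glBindTexture(GL_TEXTURE_2D, colorTex);  // previously allocated texture
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0,    // target, mip level
                        0, 0,                // destination x, y in texture
                        0, 0,                // source x, y in framebuffer
                        texWidth, texHeight);

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... now render the main scene sampling from colorTex ...
}
```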

Also, in many ways, the function of OpenGL (as a graphics standard) is somewhat incompatible with game development. Game developers need quick access to powerful hardware; if the API is horrible, they can work around it. Sure, they would prefer a good API, but they can get that in the next revision of the library. Constant API change doesn’t hurt them too much.

However, OpenGL is supposed to be a standard for graphics rendering. This means it must be well thought out, because it may last for 10+ years. The audience for standardized graphics rendering can’t handle constant API changes or other things of that nature. They want the right thing, and are willing and able to wait for it. Their customers don’t need bleeding-edge hardware interfaces, so the committee approach is just fine. You add to standards or create new ones; you don’t modify them.

Quite frankly, I wish either someone would step forward to become the overlord of OpenGL, or a small group of IHVs on the ARB would form a cabal. In either case, the purpose of this individual/group would be to create quick EXT extensions to add appropriate functionality in a platform-neutral way.
I disagree.

The members each do their own research, invest lots of money, give it away to GL for nothing, and probably license it to MS to be incorporated into D3D, and sometimes to each other.

It’s not a problem with doing things quickly. The VBO spec simply did not exist, but once it was written, it went into drivers pretty quickly.

Same thing is happening with EXT_render_target.

It’s not a problem with doing things quickly. The VBO spec simply did not exist, but once it was written, it went into drivers pretty quickly.
The problem is that the VBO spec should have been out 1.5 years earlier. The need for it arose with the Radeon 8500; until then, there was no decent competitor to challenge nVidia, so VAR was a reasonable extension.

At that time, ATi should have either implemented VAR (not the best idea) or created a VBO of their own. They tried with VAO, but it wasn’t that great an extension. The principal difference between VAO and VBO is the overloading of the standard pointer functions; that’s something they could have found a solution for, rather than making their own.
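To show what that overloading means in practice, here’s a minimal VBO sketch (verts and vertexCount are assumed application data, and the ARB entry points are assumed loaded): once a buffer is bound, the familiar glVertexPointer call is reused, with its pointer argument reinterpreted as a byte offset into the buffer.

```cpp
// ARB_vertex_buffer_object: upload once, then drive the standard
// vertex-array entry points from the bound buffer.
void drawWithVbo(const GLfloat* verts, GLsizei vertexCount)
{
    GLuint vbo;
    glGenBuffersARB(1, &vbo);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glBufferDataARB(GL_ARRAY_BUFFER_ARB,
                    vertexCount * 3 * sizeof(GLfloat),
                    verts, GL_STATIC_DRAW_ARB);

    // The "overloaded" call: with a buffer bound, the last argument
    // is a byte offset into the buffer, not a client-memory address.
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid*)0);
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
}
```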

The problem is in the generation of the specs, not in how fast they are implemented. How long were we without ARB_vp? We still don’t have a ps1.x-level ARB extension. And EXT_render_target may not even be implemented by ATi; they might push for superbuffers instead. Nobody will agree to just spend the time to write a spec. It should not take 1.5 years to take VAO/VAR and build VBO from them; it isn’t that complicated an extension. Same with EXT_render_target; we should have had this years ago.

Originally posted by V-man:
I disagree.

The members each do their own research, invest lots of money, give it away to GL for nothing, and probably license it to MS to be incorporated into D3D, and sometimes to each other.
This is just enlightened self-interest, and all they give away is the ability to develop for their cards. They know they need to sell hardware, and apps need to use it. If they could cut each other’s throats by messing around with the API they would, but they’re smart enough to understand that this probably isn’t the best course of action.

OpenGL is just the programming semantics; these guys develop hardware, and that’s what they sell. As part of being a member of the OpenGL ARB there’s some sort of I.P. cross-licensing agreement, but it cuts both ways.

Korval, I still say that the ARB works better as a group than with fewer members or one overlord.

Isn’t it true that EXT_rt is a collaboration between 3DLabs, NV and ATI?

What about GLSL? It comes from 3DLabs, and ATI was pushing for it. There are no vendor-specific HLSLs. Anyone noticed that? Too complex :slight_smile:
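For reference, the cross-vendor setup looks roughly like this through the original ARB entry points; a minimal sketch with an illustrative pass-through shader (entry points assumed loaded, no error checking):

```cpp
// Minimal GLSL setup via ARB_shader_objects / ARB_vertex_shader.
static const char* kVsSource =
    "void main() {\n"
    "    gl_FrontColor = gl_Color;\n"
    "    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "}\n";

void useTrivialGlslProgram()
{
    GLhandleARB vs = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
    glShaderSourceARB(vs, 1, &kVsSource, NULL);
    glCompileShaderARB(vs);

    GLhandleARB prog = glCreateProgramObjectARB();
    glAttachObjectARB(prog, vs);
    glLinkProgramARB(prog);
    glUseProgramObjectARB(prog);  // same language on every vendor
}
```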

The points you make are valid, but that was in the 2000-2002 period.

Though I’ve been told there are extensions in development, they are currently not available in GLSL either.

The whole point of DX’s advantage is that they don’t have to write extensions. They release updates to DX fast enough to accommodate the different hardware vendors (in fact it seems like they switch off every major version or something :stuck_out_tongue: ).

Also, their default shading-language path (effect files) is aimed much more toward backwards compatibility, which makes it a non-problem to have functions that will compile only for specific shading-language sets. It would certainly take a lot more work to rewrite a shading script to work optimally with tons of different shading paths, but that’s what you’d have to do anyway with GLSL. Sure, there are plenty of ethical reasons why GL is better, but right now DX seems to have a much nicer setup (not to mention many developers are entrenched :stuck_out_tongue: ).

I would still say that GL is easier to use than DX for normal stuff (triangles, display lists, state, etc.), but DX has a huge advantage with respect to shading. Sure, a lot of the stuff they have there might belong at a higher level (a GLSL manager, for instance), but having it out of the box is very nice (see other people on the GLSL suggestions forum complaining about various things…). And given that the game-dev community is moving at an ever more breakneck pace, ease of use is a huge consideration (especially when the world is converting to shaders).

The main problem, IMHO, is that GL didn’t see it coming fast enough. I think 3DLabs had some really good ideas for GL2.0, but given how long it’s taking to bring other stuff to fruition, not enough came of them… I hope the ARB has something else up its sleeve for times to come, because right now they are on DX’s heels in many respects, but still behind.

Isn’t it true that EXT_rt is a collaboration between 3DLabs, NV and ATI?

No, it’s 3DLabs, nVidia, and Apple. ATi is pushing superbuffers.

What about GLSL? It comes from 3DLabs, and ATI was pushing for it. There are no vendor-specific HLSLs. Anyone noticed that? Too complex
A vendor-specific HLSL, like Cg?