DX11 catchup

I’ve heard this as an excuse (games sell cards, games use D3D, so there’s no reason to put a lot of effort into OpenGL) from employees at a major chipmaker whose OpenGL support is so crappy that most developers will revert to software OpenGL rather than use that vendor’s drivers.

You could have just said “Intel.” It’s not like we don’t know who you’re talking about :wink:

There’s no longer any reason to buy the latest and greatest graphics card to play the latest PC game. You can just buy that same game and play it on your console on your 52" plasma and 5.1 surround sound.

Two things.

One: I am typing this on my computer, which is connected to a 100" projection screen with 5.1 surround sound.

Two: Buying a game for a console requires buying the console. And it can also lead to diminished experience. As a Team Fortress 2 player, I wouldn’t even be playing the game anymore if I had the 360 or PS3 version. No updates and no mouse-look == fail.

And consoles aren’t getting StarCraft II or Diablo 3 or WoW. Or Civilization (a real Civ game, not watered-down crap). Or quite a few other games.

Having a game-quality PC is still important. It may have diminished importance, but many people do it, and many games still require it.

Furthermore, let’s say you’re right. If you are, then that means that GPUs don’t matter. It’s not that a lessened focus on gaming will suddenly make OpenGL better. It means that people simply won’t care about 3D graphics, because the primary use of 3D graphics is games. If you aren’t playing games on your PC, you aren’t using your card’s 3D capabilities (outside of things like Aero Glass and such).

So doing no 3D on desktops isn’t going to help OpenGL in the slightest.

This is a totally ridiculous statement. The market is apparently a bigger place than you’re aware of.

This is a totally ridiculous statement. The market is apparently a bigger place than you’re aware of.

OK, then. What do you need a Radeon HD 5850 for besides games? What will that do that a cheap, $50 G70 part from NVIDIA won’t do?

And I’m not talking about niche markets either, like flight simulators, high-performance computing, and such. These people use machines built specifically for that purpose. I’m talking about desktop PCs, which are used by ordinary people.

Then you should have stated that. You asserted that if games aren’t the driving force behind GPU sales on PCs (which probably isn’t true, but whatever), then GPUs don’t matter, and implicitly that nothing but gaming applications of GPUs matter. To you maybe, but your world is small.

And once again, you’re wrong. I’ve worked in both of your example markets (HPC/scientific visualization and commercial flight simulators), and I can tell you – it used to be “machines specifically built for that purpose”. Many that clung to them are dead or dying now. Today I believe, by and large, these apps run on plain-old PCs powered by off-the-shelf mass-market GPUs.

Today I believe, by and large, these apps run on plain-old PCs powered by off-the-shelf mass-market GPUs.

The bread and butter of the GPU industry is people who want to play games. It is gamers who drove down the cost of GPUs to the point where these apps can run on off-the-shelf GPUs. But there simply aren’t enough of those professional users to keep the GPU industry going.

Bottom line is this: if you take the gamers out of the GPU market entirely, you’ll see GPU prices go up dramatically, simply from the massive lack of demand. Most people only care about GPUs with regard to their ability to play games.

Apparently, my earlier attempt at a reply to this didn’t go through. Let’s try again…

Yeah, I’m sure most PC gamers have that setup too :stuck_out_tongue_winking_eye:

Are you really going to try to argue that buying a gaming PC and keeping it up to date (especially the graphics card) is somehow cheaper than buying a single console once every 5-6 years?? A console, at most, will cost you $600. A really good graphics card alone can cost that much.

Right, and we’re back to MMOs and sims. If those are your type of game, then you’re going to be playing them on a PC. Otherwise, you’re probably better off on a console. But also take note that those particular games aren’t a driving factor for buying new GPUs either. You can get by pretty easily with a low-end card to play those games. And it won’t be watered down either since WoW and StarCraft and Civ were designed to run on cards from 5 years ago.

No, it means that new GPUs for PC games don’t matter. If you take a look, many if not most new PC games still only require DX9. Why bother with a DX11 GPU?

And that’s the point I’m arguing. Games are no longer the sole user of 3D graphics on the PC. Perhaps my point of view is skewed because I work in graphics, but it seems more and more applications have or require some sort of 3D visualization. In fact, it seemed it was difficult to capture VC interest without having some sort of snazzy 3D visualization. And Dark Photon is right, applications that used to require and run on high-end SGIs or specialized IGs are mostly running on plain old PCs nowadays (flight simulators, medical imaging, oil & gas, CAD, GIS, and many more). And this is still not counting the emergence of 3D usage in consumer electronics. A lot of things from TVs to set-top boxes to cell phones require and use OpenGL for their UI and/or video playback applications.

My point is that the non-games markets have grown significantly in the past few years. Perhaps to the point of eclipsing what remains of the PC games market. And if the majority of non-games markets are using OpenGL, then it would benefit OpenGL if the hw vendors realized what is going on.

Are you really going to try to argue that buying a gaming PC and keeping it up to date (especially the graphics card) is somehow cheaper than buying a single console once every 5-6 years?

It’s strange that you bring that “5-6 years” up. Particularly since your later argument is that PC games aren’t as demanding of new hardware as they used to be. It’s entirely possible that an $800 gaming PC from 3 years ago would still be serviceable today. And it’s quite possible that an $800 gaming rig from today would last a good 5-6 years.

Right, and we’re back to MMOs and sims.

And FPS’s, unless you honestly believe that the best FPS experience can be had on a console without mouse&keyboard support. And RTS games (which are not sims).

PC gaming is certainly not master of the gaming industry. But it is far from dead. In-store PC game sales are down, but that is partially due to the increasing prominence of online sellers like Steam.

If you take a look, many if not most new PC games still only require DX9. Why bother with a DX11 GPU?

Because it’s faster at DX9 stuff than any DX9 GPU was. That’s why XP users bought DX10 parts too: they were faster.

And most new PC games require DX9 as a minimum for three reasons:

1: Microsoft hurt DX10’s growth by making it Vista-only. The launch failures and public perception of that OS stunted the willingness of game developers to make games that significantly use DX10.

2: Because it is prevalent. DX9 hardware has been available for a long time. Just look at the Steam hardware survey; virtually nobody gaming has DX8-only hardware, so there’s no point in supporting it. DX10 adoption is relatively slow, though there are quite a lot of DX10 parts out there. DX9 is simply the current minimum level of functionality. Making a game that only 30% of people who might be interested can play is silly.

3: Outside of incremental hardware improvements (longer shaders, faster shaders, more uniforms, etc), there really isn’t much difference from DX9 hardware to DX10 hardware. Geometry shaders aren’t that big of a deal; they’re too slow in practice to be useful. And there weren’t very many other big ticket “must have” DX10 features that would fundamentally change how you make effects. DX10 hardware is used pretty much like DX9, except with more complex shaders. So making DX9 a baseline minimum is quite easy.

A lot of things from TVs to set-top boxes to cell phones require and use OpenGL for their UI and/or video playback applications.

I don’t know of a single TV that uses OpenGL; I don’t even know how it could, since you’re not allowed to run code on them. And I don’t know of a single cell phone that uses OpenGL either. I know of a few that use OpenGL ES, but that’s not the same thing as OpenGL.

I wish that it were. I’m sure ATI could do a much better job writing GL ES drivers for Windows than their current GL drivers. But unfortunately, OpenGL ES is only available for embedded systems.

Perhaps my point of view is skewed because I work in graphics, but it seems more and more applications have or require some sort of 3D visualization.

Such as? The “standard” applications, Office suites, e-mail programs and the like, have no need for 3D. And even if they do, it’s nothing that a GPU from half a decade ago couldn’t render with ease.

And Dark Photon is right, applications that used to require and run on high-end SGIs or specialized IGs are mostly running on plain old PCs nowadays (flight simulators, medical imaging, oil & gas, CAD, GIS, and many more).

And you really think that, if PC gamers dropped off the face of the planet, the GPU industry could survive just fine off of these kinds of specialized applications?

My point is that the non-games markets have grown significantly in the past few years. Perhaps to the point of eclipsing what remains of the PC games market.

No, it has not. What is OpenGL ES used for on iPhones/etc? Games. Oh, you might get an application here or there that draws something in polygons. But the majority of non-gaming applications use 2D effects.

Your original point was that PC games drive buying the latest GPUs. New GPUs come out every 6 months to a year. If you’re not buying those GPUs on a regular basis to put into your gaming rig to play games, then games are not driving the sales of GPUs. And that proves my point.

I play Unreal Tournament on my PS3 with keyboard and mouse. There’s really no reason for a developer to not support keyboard and mouse on consoles (especially Unreal Engine based games).

Close enough. In my opinion, they’re equally boring and require time that I simply don’t have. There’s a market for people who like those games and they can have it :wink:

I never claimed it was dead. But it is smaller than it was several years ago.

Pretty much proving my point that games alone aren’t driving the sales of DX11 hw. Games aren’t taking advantage of what’s available to them in DX10/DX11 because of the reasons you state.

I do. I’ve worked for two consumer electronics companies that have shipped TVs that use OpenGL for their UI. Ditto for the set-top boxes.

ES2.0 is close enough. It has most of the same restrictions as OGL3.2. It’s not missing that many core features compared to desktop OpenGL (though it is missing some). At my current job, I’ve got a fairly significant codebase that uses the exact same OpenGL code for both ES2 and desktop OpenGL.
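
A rough sketch of the kind of code I mean (not lifted from that codebase; the USE_GLES2 flag is just a stand-in, and the desktop build is assumed to already have an extension loader): the buffer-object and generic-vertex-attribute path below compiles unchanged against either API.

```c
/* Minimal sketch: these calls are valid, unchanged, under both OpenGL ES 2.0
 * and desktop OpenGL 2.0+; only the header (and, on desktop, the extension
 * loader) differs. On a 3.2 core context you would also bind one VAO at init,
 * which ES2 doesn't have. */
#ifdef USE_GLES2            /* stand-in build flag, not a real define */
#include <GLES2/gl2.h>
#else
#include <GL/gl.h>          /* desktop: entry points beyond 1.1 come from a loader */
#endif

static GLuint upload_triangle(void)
{
    static const GLfloat verts[] = {
        -0.5f, -0.5f,
         0.5f, -0.5f,
         0.0f,  0.5f,
    };
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);                 /* same entry points in ES2 and GL */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    return vbo;
}

static void draw_triangle(GLuint vbo, GLuint pos_attrib)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(pos_attrib); /* generic attributes only; no fixed-function paths */
    glVertexAttribPointer(pos_attrib, 2, GL_FLOAT, GL_FALSE, 0, (const void *)0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}
```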

Last year I was at a startup doing GIS. Until very recently, 2D was pretty much the standard in GIS visualizations. That’s changing now. Oh, and just because an application doesn’t need 3D doesn’t mean it’s not going to use it. Why the heck does Javascript need OpenGL bindings?? But it has them via WebGL.

I never said anything about PC gamers dropping off the face of the planet. Only you did. I did suggest that the entirety of the non-games markets, including consumer electronics, has perhaps surpassed the PC games market and is worthy of appropriate attention from hw vendors.

More games using OpenGL only helps my point that hw vendors need to stand up and pay better attention to the API rather than ignoring it in favor of D3D.

My gut feeling is that games are still the primary driver of GPU sales. However, most gamers can play their games at max quality using “only” a low- to mid-range card.

The lack of need for a high-end GPU to play games must be the primary driver for NVIDIA and ATI to push GPGPU. It will be a huge new market for beefy GPUs, which I suspect will easily surpass the hardcore gaming niche.

Currently OpenCL is looking very attractive I think, so OpenGL could benefit indirectly when OpenCL-OpenGL interaction is added to the specs.

I think it’s safe to say that as long as there are PCs people will play games on them. :slight_smile:

Seems to me the arguments so far are different branches of the same tree. We’ve got 3 major technologies at work here: computation, display and human interface. The business models we’ve seen so far tend to capitalize on an unavoidable economy of scale; from the cell phones to the notebooks to the laptops to the consoles to Alfonse Reinheart’s mainframe and home theater - they each appeal to a particular age group, interest and budget.

Now word on the street is that we’ll eventually see these technologies funnel into a single ubiquitous virtual experience in the form of display implants, direct neural-link human interfaces and such - you know, cybernetics. In the meantime quantum, optical, molecular or something completely different could steal the show in the next decade or 2. Though the issue of scale will likely persist in its various forms and will doubtless serve as bases for the unseen markets of tomorrow.

. . .

Bla bla bla, bla bla, bla bla.

Just my two cents.

Oh, did anyone so far mention this:

http://tech.slashdot.org/story/10/01/08/1830222/Why-You-Should-Use-OpenGL-and-Not-DirectX?art_pos=15

It’s hilarious because the guy “debunks MS’s false information” and spreads a whole lot of false information himself. But well, it’s good advertisement for his awesome little company that will rule the world, yeah!

Now please continue.
Jan.

Your original argument was that hardware makers catering to the non-gaming crowd will miraculously cause their OpenGL drivers to get better. I don’t see this happening, for two reasons.

There are two groups of non-gamers interested in GPUs: people doing visualization, and people doing GPGPU. GPGPU needs OpenCL, not OpenGL, so catering to that crowd will divert resources away from OpenGL drivers.

Visualization users generally are like Dark Photon: they tend to have complete control over what hardware their users use. So really, driver quality is important only to the degree that their code works for the given platform.

Your original point was that PC games drive buying the latest GPUs. New GPUs come out every 6 months to a year. If you’re not buying those GPUs on a regular basis to put into your gaming rig to play games, then games are not driving the sales of GPUs. And that proves my point.

My point was that PC games drive buying of GPUs period. Whether gamers are buying the latest or not, they’re still the primary deliberate consumer of GPUs. While a few gamers have been willing to buy $400+ cards, you will find that the sales curve has always skewed to the $100-$200 range.

Games aren’t taking advantage of what’s available to them in DX10/DX11 because of the reasons you state.

As I pointed out, there’s not much functionality difference between DX9 and DX11, so there’s not much to take advantage of. Also, as I pointed out, there is a substantial performance difference between DX9 cards and DX11 cards, which game developers are taking advantage of.

Close enough. In my opinion, they’re equally boring and require time that I simply don’t have.

The entire nation of South Korea would like to disagree with you.

ES2.0 is close enough. It has most of the same restrictions as OGL3.2. It’s not missing that many core features compared to desktop OpenGL (though it is missing some). At my current job, I’ve got a fairly significant codebase that uses the exact same OpenGL code for both ES2 and desktop OpenGL.

And yet, they are not the same. They are compatible to a degree, but they’re not the same thing. ES 2.0 has none of the legacy cruft that GL 3.2 does. That’s why you’ll find ES 2.0 implemented on various bits of hardware that you’d never see a legitimate GL 3.2 implementation on.

Why the heck does Javascript need OpenGL bindings?? But it has them via WebGL.

Games. Web applications are becoming an increasingly big thing. Once you can do client-side OpenGL rendering, you can run JavaScript-based games in a web browser.

Granted, lack of Internet Explorer support is pretty much going to make WebGL stillborn. But it’s a good idea anyway.

And again, it is OpenGL ES, not regular OpenGL.

The lack of need for a high-end GPU to play games must be the primary driver for NVIDIA and ATI to push GPGPU. It will be a huge new market for beefy GPUs, which I suspect will easily surpass the hardcore gaming niche.

Outside of entities doing serious number crunching, what good is GPGPU to the average user? The most you can get out of it is accelerated movie compression. That’s useful to a degree, but I don’t think very many actual human beings are going to buy an HD 5850 just to make their movie compression go faster.

Of course, the HD 5850 does include double-precision computations, which is something the HPC people have been pushing for. However, catering to the GPGPU crowd doesn’t mean improving OpenGL; these people want to use OpenCL.

so OpenGL could benefit indirectly when OpenCL-OpenGL interaction is added to the specs.

OpenGL (and D3D) interaction is already part of the OpenCL spec.
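
Roughly, the sharing path already in the OpenCL 1.0 spec (cl_khr_gl_sharing) looks like the sketch below. It assumes a CL context created with GL sharing enabled, an existing queue, kernel, and GL buffer object, and it omits error checking; it’s an illustration, not anyone’s production code.

```c
/* Sketch of cl_khr_gl_sharing from the OpenCL 1.0 spec: wrap an existing GL
 * buffer object as a CL buffer, then acquire/release it around the kernel
 * launch. Assumes the CL context was created with GL sharing enabled and
 * that queue, kernel and the GL VBO already exist; error checks omitted. */
#include <GL/gl.h>
#include <CL/cl.h>
#include <CL/cl_gl.h>

static void run_kernel_on_gl_vbo(cl_context ctx, cl_command_queue queue,
                                 cl_kernel kernel, GLuint vbo, size_t work_items)
{
    cl_int err;
    cl_mem shared = clCreateFromGLBuffer(ctx, CL_MEM_READ_WRITE, vbo, &err);

    glFinish();                                   /* GL must be done with the VBO   */
    clEnqueueAcquireGLObjects(queue, 1, &shared, 0, NULL, NULL);

    clSetKernelArg(kernel, 0, sizeof(shared), &shared);
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &work_items, NULL, 0, NULL, NULL);

    clEnqueueReleaseGLObjects(queue, 1, &shared, 0, NULL, NULL);
    clFinish(queue);                              /* now GL can render from it again */

    clReleaseMemObject(shared);
}
```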

Oh, did anyone so far mention this:

No. It’s sufficiently stupid (as you rightfully point out) that it doesn’t deserve mention. I still don’t know why that was linked on the OpenGL.org main page.

If only this were true…

CatDog

There won’t be a 3.5. Only 3.3 and 4.0 at the same time :).

EDIT: Oops, sorry, didn’t see that there are another pages.

OpenGL has become bloated. Each vendor is pushing their own set of features and proprietary extensions instead of adding to the ARB set. A new major version had better cut down on this and get things sorted out, or OpenGL is going to die.

You’re entitled to your own opinion, but without some substantiation (specific examples), it’s unlikely to be taken seriously.

Make this criticism constructive by suggesting a specific change.

If bloat doesn’t float your boat, look at 3.2 core. Lots of older functionality deprecated and removed. Very clean (downright squeaky).

I’m going to say something that would have sounded weird a year ago:

OpenGL is just doing fine.

Really

Agreed.

Nice and subjective. I like it!

Vendor of what, exactly?

So… vendor of GL extensions? In the past decade there have been practically only three companies that have registered extensions: nvidia, amd, and apple.

I’m not so clear on how these companies have “pushed” their extensions (and therefore features). Marketing? This is OpenGL; marketing doesn’t exist. Use-my-feature-or-die strong-arming? Developers seem quite content to use DX, so strong-arm tactics would be silly. “Feature X only available on Y hardware”: if it’s not core, most developers won’t use it unless they have control over their customers’ hardware.

Basically, you are “pushing” your argument into crazy town. (Sorry, details in crazy town are few and far between so I can’t expound on this.)

True. The next version should do away with extensions to do away with the advantage they give certain hardware/companies. Extensions are like cancer, slowly eating away at good APIs. I say we practice modern techniques, such as (if I may extend my metaphor) performing gene therapy on GL by injecting it with DX. Gene therapy always works.

Then GL will LIVE. (Or become a zombie and kill us all, it depends on Fermi, I think.)