Your original argument was that hardware makers catering to the non-gaming crowd will miraculously cause their OpenGL drivers to get better. I don’t see this happening, for two reasons.
There are two groups of non-gamers interested in GPUs: people doing visualization, and people doing GPGPU. GPGPU needs OpenCL, not OpenGL, so catering to that crowd will divert resources away from OpenGL drivers.
Visualization users are generally like Dark Photon: they tend to have complete control over the hardware their users run. So really, driver quality matters only to the degree that their code works on the given platform.
Your original point was that PC games drive buying the latest GPUs. New GPUs come out every 6 months to a year. If you’re not buying those GPUs on a regular basis to put into your gaming rig to play games, then games are not driving the sales of GPUs. And that proves my point.
My point was that PC games drive buying of GPUs, period. Whether gamers are buying the latest or not, they’re still the primary deliberate consumers of GPUs. While a few gamers have been willing to buy $400+ cards, you will find that the sales curve has always skewed to the $100-$200 range.
Games aren’t taking advantage of what’s available to them in DX10/DX11 because of the reasons you state.
As I pointed out, there’s not much functionality difference between DX9 and DX11, so there’s not much to take advantage of. Also, as I pointed out, there is a substantial performance difference between DX9 cards and DX11 cards, which game developers are taking advantage of.
Close enough. In my opinion, they’re equally boring and require just as much time, time that I simply don’t have.
The entire nation of South Korea would like to disagree with you.
ES 2.0 is close enough. It has most of the same restrictions as GL 3.2. It’s not missing that many core features compared to desktop OpenGL (though it is missing some). At my current job, I’ve got a fairly significant codebase that uses the exact same OpenGL code for both ES 2.0 and desktop OpenGL.
And yet, they are not the same. They are compatible to a degree, but they’re not the same thing. ES 2.0 has none of the legacy cruft that GL 3.2 does. That’s why you’ll find ES 2.0 implemented on various bits of hardware that you’d never see a legitimate GL 3.2 on.
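To put a finer point on the “compatible to a degree” part: even shader code shared between the two needs an ES-specific guard. Here’s a minimal sketch of a fragment shader carried as a C string (names are illustrative):

```c
/* One fragment shader source shared between desktop GL and GL ES 2.0.
 * GL_ES is predefined by ES shader compilers; the precision qualifier
 * it guards is mandatory in ES 2.0 and a syntax error in desktop GLSL
 * before 1.30. */
static const char *shared_frag_src =
    "#ifdef GL_ES\n"
    "precision mediump float;\n"
    "#endif\n"
    "uniform sampler2D tex;\n"
    "varying vec2 uv;\n"
    "void main()\n"
    "{\n"
    "    gl_FragColor = texture2D(tex, uv);\n"
    "}\n";
```

That #ifdef is exactly the kind of seam I’m talking about: the common subset is real, but you’re always coding against two specs.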
Why the heck does JavaScript need OpenGL bindings? But it has them, via WebGL.
Games. Web applications are becoming an increasingly big thing. Once you can do client-side OpenGL rendering, you can run JavaScript-based games in a web browser.
Granted, lack of Internet Explorer support is pretty much going to make WebGL stillborn. But it’s a good idea anyway.
And again, it is OpenGL ES, not regular OpenGL.
The lack of need for a high-end GPU to play games must be the primary driver for NVIDIA and ATI to push GPGPU. It will be a huge new market for beefy GPUs, which I suspect will easily surpass the hardcore gaming niche.
Outside of entities doing serious number crunching, what good is GPGPU to the average user? The most you can get out of it is accelerated movie compression. That’s useful, to a degree. But I don’t think very many actual human beings are going to buy an HD 5850 just to make their movies compress faster.
Of course, the HD 5850 does include double-precision computations, which is something the HPC people have been pushing for. However, catering to the GPGPU crowd doesn’t mean improving OpenGL; these people want to use OpenCL.
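Worth noting: even double precision is opt-in at the kernel level, not a baseline feature. A sketch of what an fp64 OpenCL kernel looks like (illustrative names, embedded as a C string the way host code usually carries kernel source):

```c
/* cl_khr_fp64 is an optional extension; hardware without it will
 * simply fail to compile this kernel. */
static const char *axpy_fp64_src =
    "#pragma OPENCL EXTENSION cl_khr_fp64 : enable\n"
    "__kernel void axpy(double a,\n"
    "                   __global const double *x,\n"
    "                   __global double *y)\n"
    "{\n"
    "    size_t i = get_global_id(0);\n"
    "    y[i] = a * x[i] + y[i];\n"
    "}\n";
```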
So OpenGL could benefit indirectly when OpenCL-OpenGL interaction is added to the specs.
OpenGL (and D3D) interaction is already part of the OpenCL spec.
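Concretely, the cl_khr_gl_sharing extension lets a CL kernel operate directly on a GL buffer object. A minimal host-side sketch, assuming the CL context was created to share with the current GL context (error checking omitted; all names are illustrative):

```c
#include <GL/gl.h>
#include <CL/cl.h>
#include <CL/cl_gl.h>

/* Run a CL kernel over an existing GL vertex buffer. */
void run_kernel_on_vbo(cl_context ctx, cl_command_queue queue,
                       cl_kernel kernel, GLuint vbo, size_t n)
{
    cl_int err;
    /* Wrap the GL buffer as a CL memory object. */
    cl_mem buf = clCreateFromGLBuffer(ctx, CL_MEM_READ_WRITE, vbo, &err);

    glFinish();  /* make sure GL is done with the buffer */
    clEnqueueAcquireGLObjects(queue, 1, &buf, 0, NULL, NULL);

    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &n, NULL, 0, NULL, NULL);

    clEnqueueReleaseGLObjects(queue, 1, &buf, 0, NULL, NULL);
    clFinish(queue);  /* make sure CL is done before GL draws from it */
    clReleaseMemObject(buf);
}
```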
Oh, has anyone mentioned this yet:
No. It’s sufficiently stupid (as you rightfully point out) that it doesn’t deserve mention. I still don’t know why that was linked on the OpenGL.org main page.