What is happening?

Game companies are probably not interested in backwards compatibility.

1- Technology-wise, I can imagine that lots of developers want OO, which is what DX offers.
2- Quite a number of them want to program a single path and that’s it.
3- DX is a collection of APIs, perfect for newbies learning game making. I don’t know if companies like EA care, or whether they use the D3DX functions.
4- I think the biggest reason is tradition. Companies started with DX. Why should they switch?

That’s some of the stuff I collected. It’s just my opinion.

There’s no such thing as a single path in dx. Caps replace extensions, that’s the only difference.
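For anyone who hasn’t seen both sides, here’s a minimal sketch of what “caps replace extensions” means in practice. It assumes a D3D9 device and a GL 1.x-era context; the function names are illustrative, not from any real codebase:

```cpp
#include <cstring>
#include <d3d9.h>    // D3DCAPS9, IDirect3DDevice9
#include <GL/gl.h>   // glGetString

// Direct3D: query a caps structure once, then test bits.
bool CanDoTwoSidedStencilD3D(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);
    return (caps.StencilCaps & D3DSTENCILCAPS_TWOSIDED) != 0;
}

// OpenGL: search the extension string instead. (strstr is the
// quick-and-dirty version; a robust check matches whole tokens.)
bool CanDoTwoSidedStencilGL()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::strstr(ext, "GL_EXT_stencil_two_side") != 0;
}
```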

Originally posted by CrazyButcher:
you’re right marco, the mass of games in total comes from consoles.
I don’t know where you live, but in my country it’s more like 50%/50% (or 49%/51% to be precise).

Originally posted by tfpsly:
[quote]Originally posted by CrazyButcher:
you’re right marco, the mass of games in total comes from consoles.
I don’t know where you live, but in my country it’s more like 50%/50% (or 49%/51% to be precise).
[/QUOTE]look at the sales volume worldwide.

i looked at dx once i think. it looks like most high level microsoft APIs to me. which translates personally into, ‘no way am i touching this’. usually microsoft has lower level APIs though, which are less cumbersome to work with… at least for the os runtime APIs there is generally more than one way to get something done. so dx probably has a low level equivalent too.

but it definitely looks like a hornet’s nest from the outside, at least. speaking from the perspective of someone who has a personal ire for ‘scheduled obsolescence’ code… i would say opengl is the pick for a strong foundation that will last through the ages with minimal upkeep. dx is maybe for something you plan to toss out back in a couple years or constantly renovate.

dx i figure probably bears the mark of microsoft code in general: extremely rushed, overmanned, and developed by people who have no real personal stake in it, ie: don’t actually use and depend on the software.

personally though i just hate the way dx code looks, and have no investment in microsoft.

opengl i’m happy with though on the whole.

i could easily see directx die; opengl, no way… people would rush to raise it from the dead if its corporate sponsors fell out.

I started with opengl, then, not understanding its heavy state machine reliance, I switched to d3d, and it wasn’t until I needed register combiners that I switched back to gl. By then the programming paradigm had clicked, and I got rid of lots of d3d code during my project’s conversion to gl. The way I kept in touch with gl while coding in d3d was by translating some gl code to d3d, and the more I learned about gl the more I liked it. I grew up with d3d 5, 6, 7, 8 and 9, and the constant recoding of fundamental things like framework init code got on my nerves. Did you know that d3dx7 had the shortest init code of any version? They also have that duality going between d3d and d3dx, so if you strip away d3dx you get a pretty bare API, much less than gl. Plus, I replicated most of d3dx in my own code for control purposes, as that API has some nasty bugs, and I don’t want to wait for ms to fix them, if ever. Plus, d3d is over-designed and poorly documented for doing advanced things with it. I still remember pulling my hair out implementing the right function call sequence for the d3d font interface and having it keep working across d3d reset calls. You know, when you lose surfaces. Bah, I needed to get this out of my system.
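(For anyone who hasn’t suffered it: the reset dance looks roughly like the sketch below. It assumes a D3D9 device and a single D3DX object; the names are illustrative.)

```cpp
#include <d3d9.h>
#include <d3dx9core.h>   // ID3DXFont

// Called once per frame before rendering; pp holds the original
// present parameters used to create the device.
void HandleLostDevice(IDirect3DDevice9* dev, ID3DXFont* font,
                      D3DPRESENT_PARAMETERS& pp)
{
    HRESULT hr = dev->TestCooperativeLevel();
    if (hr == D3DERR_DEVICELOST)
        return;                       // still lost: skip this frame
    if (hr == D3DERR_DEVICENOTRESET)
    {
        font->OnLostDevice();         // release default-pool resources first
        if (SUCCEEDED(dev->Reset(&pp)))
            font->OnResetDevice();    // then recreate them
    }
}
```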

'kin awful, isn’t it?

Most of the best-selling 3D games of the last two years used OpenGL. There are also good games that use D3D, so you can make good games independently of the API. Even titles from the last months like Chronicles of Riddick are OpenGL, and in titles like World of Warcraft you can select OpenGL or D3D. For me, the only problem with OpenGL is ATI’s lack of good support for high-end features (even when its hardware supports them), but they are improving it (they are now making the XBox and Nintendo Revolution graphics chips, so I presume that this is the cause of the slowdown in their driver enhancements in the last months).
I have learned the last versions of D3D to help other people, and I really prefer OpenGL over D3D. But this is my personal opinion.
Anyway, I feel like there are more people switching back to OpenGL than to D3D now that we have a decent ARB collection of extensions. I also think that going from 1.5 to 2.0 has been a good ‘marketing’ decision to get the attention of some developers.
Also, the rumors that the PS3 and Revolution will use OpenGL ES or an OpenGL clone as their graphics API seem to be making some developers look back at OpenGL and see that it has improved a lot since the last time they ‘visited’ it (probably around version 1.2).
What I found with ‘newcomers’ is that they get confused when they see the big list of extensions, and they don’t understand why, if some of them are deprecated, obsolete, or included in the last versions of the API, they are still exposed. For example, they don’t know whether they should use CVA, VAR or VBO, and they have to learn about all of them and ask other people to find out that VBO is the current recommended way and that CVA is even undefined.
Maybe a way to know the real extensions for the current API version would be good. A function like glGetString(GL_EXTENSIONS_2) that shows only 2.x extensions could work (with GL_EXTENSIONS continuing to expose everything, so as not to lose backwards compatibility). That way we could stop seeing EXT_vertex_array in the extension list and having to explain to new people that it has been included in the API since 1.1 or 1.2 (I don’t remember exactly).
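Until something like that exists, the same idea can be sketched on the client side; the promoted[] list below is a hand-maintained, deliberately incomplete example:

```cpp
#include <cstdio>
#include <cstring>
#include <GL/gl.h>

// Print only the extensions NOT already folded into the core version
// we target. Illustrative only; the real list would be much longer.
void PrintNonCoreExtensions()
{
    static const char* promoted[] = { "GL_EXT_vertex_array",
                                      "GL_ARB_multitexture",
                                      "GL_ARB_vertex_buffer_object" };
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!all) return;

    static char buf[65536];                 // extension strings get long
    std::strncpy(buf, all, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    for (char* tok = std::strtok(buf, " "); tok; tok = std::strtok(0, " "))
    {
        bool core = false;
        for (size_t i = 0; i < sizeof(promoted) / sizeof(promoted[0]); ++i)
            if (std::strcmp(tok, promoted[i]) == 0) { core = true; break; }
        if (!core)
            std::printf("%s\n", tok);
    }
}
```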

Don’t agree. To address the problem you’ve highlighted (new developers being confused by multiple ways of doing things because old extensions are still supported in the drivers), you don’t introduce a new way of querying extensions; you produce and keep up to date a document on the extension registry webpage which details the commonly used extensions. It’s a documentation issue rather than a problem with the API.

indeed, who hasn’t visited delphigl’s excellent extension usage site to look up what is widespread. As I previously mentioned, some sort of centralism when it comes to information (such as the dx9sdk) would be nice to have for GL. of course there is the spec and the extension specs, but they’re not that comfortable imo.

Originally posted by knackered:
There’s no such thing as a single path in dx. Caps replace extensions, that’s the only difference.
Yes, but I said companies want to code a single path :slight_smile:

Some people have this impression that you have to resort to vendor-specific extensions in GL, while in DX you don’t.

Look at HL2. Does it optimize for Geforce FX?
Does it use normalization cubemap lookup instead of math?
Does it use half precision temps? (if it’s possible in D3D)

Doom3 goes to that trouble.

It’s not a matter of API. It’s a matter of people’s opinions of the APIs, and of wanting to do the minimum work.
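(For context, the normalization trade-off mentioned above looks like this in ARB_fp terms; both fragments are illustrative, and assume a `TEMP r0;` declaration and, for the second, a normalization cubemap bound to texture[0].)

```cpp
// (a) exact math: three ALU instructions
static const char* kNormalizeMath =
    "DP3 r0.w, fragment.texcoord[0], fragment.texcoord[0];\n"
    "RSQ r0.w, r0.w;\n"
    "MUL r0.xyz, fragment.texcoord[0], r0.w;\n";

// (b) normalization cubemap: one lookup, limited precision
static const char* kNormalizeCube =
    "TEX r0, fragment.texcoord[0], texture[0], CUBE;\n"
    "MAD r0.xyz, r0, 2.0, -1.0;\n";   // unpack [0,1] -> [-1,1]
```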

I make my own gl headers, so that’s the only major work for me. Otherwise I rely on the gl core and sgi docs and very few extensions. Also, why don’t ihvs expose their cap bits in d3d? That boggles my mind. I don’t have a computer with ati installed, so I can’t use that d3d caps utility to check them. Each d3d ihv driver should show its supported caps.

I remember pulling my hair out trying to figure out the ddraw clipper and how to set it up. The migration from d3d 8 to 9 had tons of errors in the docs, and I was constantly sending bug reports to the dx team. They did write back, though, so that’s good. It got so bad at one point that I thought about decomposing the html help into html files, editing them by hand, then zipping them back into a chm file. Gl docs are awesome by comparison. Plus, ihv extension docs are nifty as well, as they give you much-needed details.

That VBO issue I had in gl, i.e. driving it efficiently, exists the same way under d3d. I noticed they had to document proper VBO-style usage in the d3d docs as well. Don’t get me started on d3d texture stages and their really screwed-up documentation, as well as the actual usage of that mechanism. Gl arb_multitexture is a dream to work with. Heck, register combiners are easier to work with than d3d texture stages. Crossbar is an absolute savior in gl.
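(“Making my own gl headers” boils down to fetching each extension entry point yourself; a minimal sketch for the VBO entry points, assuming Windows and a glext.h with the usual typedefs:)

```cpp
#include <windows.h>     // wglGetProcAddress
#include <GL/gl.h>
#include <GL/glext.h>    // PFNGL... typedefs

PFNGLGENBUFFERSARBPROC glGenBuffersARB = 0;
PFNGLBINDBUFFERARBPROC glBindBufferARB = 0;
PFNGLBUFFERDATAARBPROC glBufferDataARB = 0;

// Call once with a current GL context; returns false if VBO is absent.
bool LoadVBOEntryPoints()
{
    glGenBuffersARB = (PFNGLGENBUFFERSARBPROC)wglGetProcAddress("glGenBuffersARB");
    glBindBufferARB = (PFNGLBINDBUFFERARBPROC)wglGetProcAddress("glBindBufferARB");
    glBufferDataARB = (PFNGLBUFFERDATAARBPROC)wglGetProcAddress("glBufferDataARB");
    return glGenBuffersARB && glBindBufferARB && glBufferDataARB;
}
```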

Originally posted by Mark Kilgard:

Direct3D out-software engineered the ARB when it came to engineering a programmable shading language. Direct3D builds its shading language implementation into a redistributable library that you can package with your application. The library targets a (tokenized) assembly interface. So a new language feature (or compiler bug fix) can be utilized without necessitating end-user driver upgrades (and reboots) by just using the latest compiler library.

Yes! This is precisely why I had to bite the bullet and turn to d3d. On the other hand, you will never make me use Cg, because other hardware vendors will never support it. How long will it take for the ARB to realise that this was a bad choice?
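(The flow Kilgard describes, sketched against the D3D9-era D3DX library; the entry-point name and the ps_2_0 target are just examples:)

```cpp
#include <windows.h>
#include <d3dx9.h>   // D3DXCompileShader, ID3DXBuffer

// Compile HLSL source to tokenized assembly with the redistributable
// compiler; the resulting buffer is what CreatePixelShader() consumes.
LPD3DXBUFFER CompilePixelShader(const char* src, UINT len)
{
    LPD3DXBUFFER code = 0, errors = 0;
    HRESULT hr = D3DXCompileShader(src, len,
                                   0, 0,             // no macros, no includes
                                   "main", "ps_2_0", // entry point, profile
                                   0,                // flags
                                   &code, &errors, 0);
    if (FAILED(hr) && errors)
        OutputDebugStringA((const char*)errors->GetBufferPointer());
    if (errors) errors->Release();
    return code;   // NULL on failure
}
```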

Oh yeah, and another thing making me stay in the d3d world: The Debug Runtime™. Once you’ve used it, you’ll never go back to the 5-enum world (with INVALID_OPERATION being used, say, 99% of the time).
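(The GL side really is that spartan; a minimal sketch of the usual workaround, a check macro you sprinkle after suspect calls and compile out in release builds:)

```cpp
#include <cstdio>
#include <GL/gl.h>

// Poor man's debug runtime: report the error enum and the call site.
#define CHECK_GL()                                              \
    do {                                                        \
        GLenum e = glGetError();                                \
        if (e != GL_NO_ERROR)                                   \
            std::fprintf(stderr, "GL error 0x%04X at %s:%d\n",  \
                         e, __FILE__, __LINE__);                \
    } while (0)
```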

Direct3D out-software engineered the ARB when it came to engineering a programmable shading language. Direct3D builds its shading language implementation into a redistributable library that you can package with your application.
ARB_fp/ARB_vp is obviously better than the D3D shaders, because it abstracts the instruction and register count, but an extended vendor-independent version for vs/ps30 is needed. ARB_fp/ARB_vp is a better compile target than GLSL, because GLSL is too high-level and introduces another level of indirection. Compilation does not necessarily mean another language. Modern game worlds need many different shaders, not just one or two like Doom. I personally think that shaders won’t be written by hand, but instead dynamically generated from some higher-level (possibly, but not necessarily, graphical) material definition, as in UE3. You need a different shader for each light/material interaction and for each pass, and it is impossible to hand-craft yet another shader or a long .fx file just because texture x should scroll. I don’t see how current high-level languages solve this problem in an elegant way. So the asm-style interface is still useful.
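(A minimal sketch of that kind of generation; the Material struct and the GLSL fragments are made up for illustration:)

```cpp
#include <string>

struct Material { bool scrollTexture; bool modulateVertexColor; };

// Build one fragment shader variant per material description.
std::string BuildFragmentShader(const Material& m)
{
    std::string src = "uniform sampler2D tex;\n"
                      "uniform float time;\n"
                      "void main() {\n"
                      "  vec2 uv = gl_TexCoord[0].xy;\n";
    if (m.scrollTexture)
        src += "  uv.x += time * 0.1;\n";      // the "texture x scrolls" case
    src += "  vec4 c = texture2D(tex, uv);\n";
    if (m.modulateVertexColor)
        src += "  c.rgb *= gl_Color.rgb;\n";   // stand-in for a lighting term
    src += "  gl_FragColor = c;\n"
           "}\n";
    return src;
}
```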

ARB_vp/fp better than D3D?

If you’re using the ‘extended’ ARB_vp/fp, then yes, it’s better.

ARB_vp/fp

pros:

  • Invariant position support (smaller vertex program, so more efficient).
  • State matrices (they save some environment constants; useful for skinning code, for example, where you need a lot of environment constants to store the bone matrices). Both are shown in the vertex program sketch further down.

cons:

  • Limited macro support: m4x4 or sin, for example, you need to emulate (writing sin in a vertex program on OpenGL is quite ‘annoying’; you need the nVidia extension to get it, which is proprietary).

  • No support for “older” video cards like the Radeon 9200. Yes, it’s an old card, but Apple is still shipping it in their Mac mini, so you miss a lot of features (it could do bump + specular mapping or some nice effects, but using proprietary ATI extensions is out of the question…).

  • No vertex program support on some ‘indie’ video cards like S3 or Matrox: no ARB vertex program, even though vertex shaders are available on them in DirectX (which exposes vertex shader versions up to 2.0!). These companies simply cannot afford OpenGL development.
    For DirectX drivers, Microsoft already gives out specs for writing a driver, so it’s easier to start with. For OpenGL, each vendor is apparently on their own.

  • No generic software TNL for video cards that don’t support it in hardware (Apple does it).

For example, Apple exposes ARB_vertex_program in software on every platform, even on an ATI 128 video card. Yes, it’s software emulated, but it is quite convenient and performs well enough.

  • No GLSL vertex shaders on older ATI video cards, while even the Geforce3 can do GLSL vertex programs.
    Support is limited, but you can write GLSL that fits just within the specs of the Geforce3.
    That saves a lot of work.

The same goes for GLSL fragment programming on the ATI 9200. For example, I write HLSL (DX9) code that fits the PS2.0 model, then I write the same code to fit PS1.4, just changing it a bit to work around the limitations (like doing a normalization cube map lookup instead of the math…).
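(For reference, here is what the two ARB_vp “pro” items above look like in an actual program; a sketch only, embedded as a C++ string for glProgramStringARB, with the fog-coordinate output as an arbitrary example use of the transformed position:)

```cpp
static const char* kVertexProgram =
    "!!ARBvp1.0\n"
    "OPTION ARB_position_invariant;\n"            // pro 1: clip position comes
                                                  // from the fixed path for free
    "PARAM mv[4] = { state.matrix.modelview };\n" // pro 2: state matrix, no env
                                                  // constants spent on it
    "TEMP eyePos;\n"
    "DP4 eyePos.x, mv[0], vertex.position;\n"
    "DP4 eyePos.y, mv[1], vertex.position;\n"
    "DP4 eyePos.z, mv[2], vertex.position;\n"
    "DP4 eyePos.w, mv[3], vertex.position;\n"
    "MOV result.color, vertex.color;\n"
    "MOV result.fogcoord.x, -eyePos.z;\n"         // eye-space depth as fog coord
    "END\n";
```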

So basically, D3D shaders are ‘more’ scalable (in the end they support more hardware) than OpenGL’s.

If you plan to release a game, D3D gives you support for more video cards, and so more customers.

I guess that’s the ‘real’ reason. However, this argument might become ‘off topic’ within the next year.

And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) “for security reasons”. That will be a killer.

(see http://www.theinquirer.net/?article=21077 )

Originally posted by execom_rt:
And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) “for security reasons”. That will be a killer.
Wow, that would indeed be a killer… so new extensions would be basically impossible, perhaps excepting those that just expose DX10 functionality? Yikes.

Has the ARB ever certified a wrapper as a conformant OpenGL implementation? Or would they be certifying the wrapper+DX10 driver taken as a whole?

Originally posted by mikef:
[b] [quote]Originally posted by execom_rt:
And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) “for security reasons”. That will be a killer.
Wow, that would indeed be a killer… so new extensions would be basically impossible, perhaps excepting those that just expose DX10 functionality? Yikes.

Has the ARB ever certified a wrapper as a conformant OpenGL implementation? Or would they be certifying the wrapper+DX10 driver taken as a whole?[/b][/QUOTE]I’ve read that only the MS default driver will be a wrapper. A good thing is that it will support OpenGL 1.5 .

Originally posted by execom_rt:
[b]ARB_vp/fp better than D3D?
cons:

  • No support for “older” video cards like the Radeon 9200. Yes, it’s an old card, but Apple is still shipping it in their Mac mini, so you miss a lot of features (it could do bump + specular mapping or some nice effects, but using proprietary ATI extensions is out of the question…).
    [/b]
    Yes, considering that ARB_fp is “ps 2.0” level, what did you expect? Also note that not all things are rosy on the D3D side in the early ps1.1-1.4 era, as there are many valid shaders that ATI/Nvidia cannot perform. It is only when you look at the OpenGL docs that you find out why they break (e.g. Nvidia with 3 constant lookups in one instruction, ATI with some of the more complicated tex3x3vspec-type instructions). So you end up having to write multiple shaders to work around driver issues anyway…

[b]

  • No vertex program support on some ‘indie’ video cards like S3 or Matrox: no ARB vertex program, even though vertex shaders are available on them in DirectX (which exposes vertex shader versions up to 2.0!). These companies simply cannot afford OpenGL development.
    For DirectX drivers, Microsoft already gives out specs for writing a driver, so it’s easier to start with. For OpenGL, each vendor is apparently on their own.
    [/b]
    I dunno about Matrox, but other “small” companies like S3/SiS and Intel do advertise ARB_vertex/fragment_program.

[b]

  • No generic software TNL for video cards that don’t support it in hardware (Apple does it).
    [/b]
    Nvidia does, so complain to ATI (or perhaps MS?)


If you plan to release a game, D3D gives you support for more video cards, and so more customers.

I would seriously doubt that any PC gamer lacks a decent OpenGL implementation (mostly thanks to the many FPSs using OpenGL). If you are targeting your game at low-end “Mom-and-Dad” players, you can almost forget high-end features anyway; even IF they have a half-decent video card, the drivers will be so out of date that most features will break.


And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) “for security reasons”. That will be a killer.
(see http://www.theinquirer.net/?article=21077 )

Don’t believe everything you read. MS is just updating the default OGL driver that ships to be 1.2 and to use D3D for hardware acceleration. Once a user installs the manufacturer’s drivers, OGL will be as it is now.

Ok here are my opinions/feelings:

· I wanna sell my soul and spirit to M$oft, because OpenGL is a non-profitable?? thing and M$oft is the devil, has da $$$, is Sodom-and-Gomorrah, and can make me as rich as the ppl you mentioned above!!! Da problem: Murphy’s Law… when you can sell your soul to Mr.666, the soul bag is FULL!!! Now FEEL the 666(9?) powah! :smiley:

· M$oft promised me that if I use XNA/DX10/WGF they will let me play with its XBox2/360 Halo97!!! yay!! Sony promised me that if I use its PS3 with OGL they will send me a free copy of the new Final Fantasy 943… BUT see this… ouch!!!

· M$oft told me assembler/C/C++ is not good and it’s better to use a Visual Basic-style .NET “managed” thingy… but atm .NET doesn’t support C function pointers well, which you need for OpenGL extensions, so I need to use a very ugly 3rd-party wrapper for OpenGL… (ok, Tao is pretty decent atm, but it’s a 3rd-party, not official, thingy; it can disappear one day, and then what??)

Well, at this point I am using Managed DirectX 9 / Windows Graphics Foundation over C#… So now I am a very happy windozed person. I really prefer this

Disk.Format(Drives.C, Wait.Yes)

to this

FormatHardDrive = (pFnFormatHardDrive)wglGetProcAddress("glFormatHardDriveARB");
FormatHardDrive(GL_RGBA /* ouch, C is NOT strongly typed, so I can pass this constant instead of the correct one */);
while (0 == glIsFormatCompleted()) {}

· Longhorn (the WinFX API, really) and non-managed C++ will be a BAD IDEA… so bye-bye native GLUT, OpenGL C function pointers for extensions, etc…

See this Windows Longhorn FAQ. Go to General Topics -> The Basics -> Can I use C++ to develop for Longhorn? “If you mean Standard C++ that targets a specific chip by producing assembly code, you’re going to be doing a lot of .NET interop to make it work. I recommend the other thing.”

Also, see this link about [Windows Graphics Foundation](http://www.extremetech.com/article2/0,1558,1629317,00.asp). No more VS/fragment shaders!!! UNIFY them! No more device caps! IHVs forced to accept min specs! Use the GPU as a calculator! No more fixed pipeline and old things! A real HW-accelerated 3D desktop! Alpha-faded 3D windows!

Free your mind, I know you are scared … People fear the changes and are not prepared, and blah blah

· I think the future is languages like Java and .NET / Linux go-mono. Assembler/C/C++ will be dead.
Oooh, see… casually, Mr. Epic Tim Sweeney is thinking about The New Programming Languages.

OpenGL should be prepared for the revolution, because it is not atm… DX is, via MDX9/WGF. See this interview with Miguel de Icaza about Gnome 4.0 (“Gnome to be based on .NET – de Icaza”, The Register). He says “Gnome 4.0 should be based on .NET”.

· I am VERY scared about the future…

- [b]PlayStation 3[/b] uses OpenGL2-NVIDIA “DreamForce” with C/C++ over IBM Cell processors. See [PS3 is easy to program](http://www.neoseeker.com/news/story/4385/) and [IBM's Cell Processor in detail](http://www.blachford.info/computer/Cells/Cell0.html). Notice the Cell processor could hardware-accelerate Java or .NET, or even drive your car!!! Also, see how this Skynet machine can kill humanity… err… I mean, how the [IBM BlueGene Supercomputer](http://domino.research.ibm.com/comm/pr.nsf/pages/rsc.bluegene_2004.html) could pass PetaFLOP capacity using Cell in the future!!! Also see the [upcoming IBM CPUs after the Cell](http://www.xbitlabs.com/articles/editorial/display/tech-process_10.html), which is REALLY impressive (if you know chemistry, that form of carbon comes after the Fullerene, which is almost a Nobel-prize thing…).

I am sure MarkJ can tell us a few things about the PS3; we have to press him until he violates the NDA and tells us all the Zion codes… err… PS3 info and NVIDIA G70 “DreamForce” GPU info…

- [b]XBox2[/b] uses XNA and .NET, ATI R500, over PowerPC processors. I am not completely sure what the hell [XNA](http://www.microsoft.com/xna) is. Perhaps it’s Visual Studio .NET + a good asset manager like Alienbrain, M$oft-style? Could I program the Xbox with C# some day, or will it be C/C++ only? Does the Xbox2 use a Longhorn CE PowerPC version?

- [b]Nintendo Revolution[/b] will probably use Java, with an ATI Hollywood GPU over a PowerPC. See this [Nintendo Revolution FAQ](http://cube.ign.com/articles/522/522559p1.html) for more details.

- [b]My PC[/b] uses the Longhorn beta with VS2005 .NET over an AMD64. I program with C# 2.0 and the Managed DirectX/WGF beta.

- [b]My mobile phone[/b] uses Java 2 Micro Edition with MIDP2/JSR184 over… a 1KHz processor???? Oh, see, Carmack is playing with mobile phones too: [http://www.armadilloaerospace.com/n.x/johnc/Recent%20Updates](http://www.armadilloaerospace.com/n.x/johnc/Recent%20Updates)

Is the future a chaos where portable tools like the wonderful multiplatform 3D engine Renderware Graphics 3.7 are not possible???

We should have made the Fahrenheit thingy work years ago… Now I see the darkness of 900000 different programming languages/APIs coming through, making things even more difficult for programmers who want to port their games to multiple platforms…

· OpenGL is not better/worse than DX… It’s just different… OpenGL is based on the old C-style paradigm with portability in mind, while DX9 is in the object-oriented C++ paradigm with the PC/Xbox in mind, and WGF/Managed DX9 is pointing to the future of “managed” .NET languages and operating systems.
I see evolution in M$oft… but I see only small changes in OpenGL. OpenGL HAD the initiative when Win95/WinG/Talisman appeared… and now it is OpenGL that follows and copies DX… mmmmmmm… think about this: while we were programming shaders using assembly, DX had HLSL with shader model 3, precomputed radiance transfer, fx techniques, the D3DX library using SSE, HDR, geometry instancing, a good managed .NET wrapper, a debug runtime…

· We saw important people leave OpenGL, starting with M$oft. Now we see other ppl go, and other people supporting the Xbox2 instead of the PS3… “This is a war, and we (ogl) are…” in the middle…

· So wake up, Neo, welcome to the real world, and follow the M$oft Palladium rabbit, because OpenGL is a VIRUS for Windows!

· Sorry, this won’t end tonight; it is INEVITABLE, Mr. Gates :smiley:

· UBB codes for LIST suck, and no, I AM NOT obsessed with The Matrix… :stuck_out_tongue: :stuck_out_tongue: :stuck_out_tongue: :stuck_out_tongue: :stuck_out_tongue:

Wow. That’s one of the most interesting posts I have read (also regarding the style :). Really scary links. Scary and hopeful. Scary for my current habits, hopeful for future abilities. However, I think OpenGL will adapt. OpenGL, or OpenGL ES, or OpenGL whatever. It will. I hope… Changes are indeed inevitable.