Switching from OpenGL to Direct3D?!?!

Hello!
Easy question: how is it possible that a game written in OpenGL can switch to Direct3D drivers?! I thought they were so different…

Thank you all so much,
Fabian

Actually you can pretty much achieve the same result with either OpenGL or Direct3D. Games/engines can be built with an abstraction layer on top of the rendering APIs: the actual game/engine code never makes any direct OpenGL or Direct3D calls, only calls to the abstraction layer. Usually the code implementing the abstraction layer for each rendering API lives in a runtime-loadable library. That’s pretty much how it’s done.
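A minimal sketch of that idea in C++ (every name here is made up for illustration, not taken from any real engine):

#include <windows.h>

class Mesh;  // engine-side mesh data, defined elsewhere

// The only rendering interface the game code ever sees.
class IRenderer {
public:
    virtual ~IRenderer() {}
    virtual bool Init(int width, int height) = 0;
    virtual void DrawMesh(const Mesh& mesh) = 0;
    virtual void EndFrame() = 0;
};

// Each driver DLL (say, a hypothetical OpenGLDrv.dll or D3DDrv.dll)
// exports a factory function with this signature:
typedef IRenderer* (*CreateRendererFn)();

IRenderer* LoadRenderer(const char* dllName)
{
    HMODULE lib = LoadLibraryA(dllName);
    if (!lib) return 0;
    CreateRendererFn create =
        (CreateRendererFn)GetProcAddress(lib, "CreateRenderer");
    return create ? create() : 0;
}

At startup the engine reads a config setting and calls LoadRenderer("OpenGLDrv.dll") or LoadRenderer("D3DDrv.dll"); everything after that goes through IRenderer, so the game never knows which API is underneath.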

Yes, but which API do you choose to base your abstraction layer on? OpenGL or Direct3D?

My first engine was based on both OpenGL and DirectX, but after a while I found out that it is not worth having an engine support both. The complexity increases and you can’t quite match the speed of an engine structured around only one of the APIs. Look at the new engines appearing: UT has decided to go with DirectX only, Carmack is working only with OpenGL, etc.

Anyway, if you still want to support both, I think you should start with DirectX, since it is not flexible enough to cope with everything OpenGL can do. If you start with OpenGL, you may find it lets you organize your data and logic in ways you can’t imitate with DirectX, so you would have to change a lot (see the geometry buffer extensions (VAR, ATI vertex array objects), display lists, etc.).
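To give one concrete example of the flexibility gap: an OpenGL display list records an arbitrary command sequence and replays it with a single call, and Direct3D of that era has no direct equivalent, so a design built around this idea is hard to port back:

#include <GL/gl.h>

GLuint BuildTriangleList()
{
    // Record a command sequence once...
    GLuint list = glGenLists(1);
    glNewList(list, GL_COMPILE);
    glBegin(GL_TRIANGLES);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();
    glEndList();
    return list;
}

// ...then glCallList(list) replays the whole sequence each frame.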

Knackered,

Why would you base your abstraction layer on a single model? If you know you need to target, say, OpenGL, DirectX, and the PS2 VUs, then you can design an abstraction layer that lets you do what you need on those systems, and it’ll probably look a little bit like each of them. It’ll also lack all the bits and pieces each of them has but that you don’t need yourself. Unless “you” are a middleware vendor, of course, in which case you’ll need to support every feature under the sun.

Note that the abstraction layer can feel free to sit at a higher level than the bottom-most graphics API. For example, it’s perfectly fine for an abstraction layer to expose a material factory, a mesh factory, maybe some PVS acceleration help, etc. Each implementation of the abstract rendering interface would then choose to do whatever is best for the low-level API it’s implementing against.
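In other words, the driver hands out opaque handles built by factories. A rough sketch (hypothetical names again):

class IMaterial { public: virtual ~IMaterial() {} };
class IMesh     { public: virtual ~IMesh() {} };

struct MeshDesc;      // canonical vertex/index layout, engine-defined
struct MaterialDesc;  // textures, blend state, etc., engine-defined

class IRenderDriver {
public:
    virtual ~IRenderDriver() {}
    // Factories: each driver builds these however its API prefers.
    virtual IMaterial* CreateMaterial(const MaterialDesc& desc) = 0;
    virtual IMesh*     CreateMesh(const MeshDesc& desc) = 0;
    virtual void       Draw(IMesh* mesh, IMaterial* material) = 0;
};

The OpenGL implementation might compile a MeshDesc into a display list or VAR buffer, while the D3D one builds a vertex buffer with a matching FVF; the game code can’t tell the difference.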

Jwatte,
Yes, I understand what you’re saying, but I don’t like the idea of abstracting the renderer to the point that you have a function to, say, RenderShadowVolume(mesh) for each API you may choose to support. There’s just too much information that needs to be stored at the object level rather than at the abstracted renderer level. For example, OpenGL has several ways of dealing with vertex streams, D3D has another, and god knows how the PS2 deals with them (does it have a driver abstraction API in its SDK, or do you program it using byte codes, or something?).
We’d need different ways of storing the various memory models within each mesh object, and suddenly it’s not abstracted any more: each object has renderer-specific members…

There are lots of algorithms (such as the stencil shadow volume one) that all use the same sequence of commands, so surely it makes more sense to abstract the commands rather than the whole algorithm?
It’s an interesting subject, but one that seems to lack coverage…renderer abstraction.
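To make the command-level idea concrete, here is a sketch of the stencil shadow pass written against a hypothetical command-style driver interface (simplified depth-pass variant; all names invented for illustration):

enum StencilOp { STENCIL_INCR, STENCIL_DECR };
enum CullMode  { CULL_FRONT, CULL_BACK };

class IMesh;
class IDriver {
public:
    virtual ~IDriver() {}
    virtual void SetColorWrite(bool on) = 0;
    virtual void SetDepthWrite(bool on) = 0;
    virtual void SetCullMode(CullMode mode) = 0;
    virtual void SetStencilOpOnDepthPass(StencilOp op) = 0;
    virtual void Draw(IMesh* mesh) = 0;
};

// The same sequence whether the driver talks to OpenGL or D3D underneath.
void RenderShadowVolume(IDriver* drv, IMesh* volume)
{
    drv->SetColorWrite(false);
    drv->SetDepthWrite(false);
    drv->SetCullMode(CULL_BACK);                 // front faces first
    drv->SetStencilOpOnDepthPass(STENCIL_INCR);
    drv->Draw(volume);
    drv->SetCullMode(CULL_FRONT);                // then back faces
    drv->SetStencilOpOnDepthPass(STENCIL_DECR);
    drv->Draw(volume);
    drv->SetColorWrite(true);
    drv->SetDepthWrite(true);
}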

Knackered,

That’s exactly why I want material-factory and mesh/geometry-factory functions to be part of my renderer driver API. The mesh loading code uses the API to push mesh data into the renderer, and the renderer converts that data to whatever is most appropriate (float->short conversion, stream separation or FVF re-ordering, etc).
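As a tiny sketch of the kind of conversion such a factory might do internally (a hypothetical helper, assuming texcoords arrive as floats in [0,1]):

#include <cstddef>
#include <vector>

// Quantize canonical float texcoords to 16-bit shorts, e.g. because the
// driver prefers a smaller vertex format on this hardware.
std::vector<short> PackTexCoords(const float* uv, std::size_t count)
{
    std::vector<short> packed(count);
    for (std::size_t i = 0; i < count; ++i)
        packed[i] = (short)(uv[i] * 32767.0f);
    return packed;
}

The loader never sees this; it just hands the driver floats and gets back an opaque mesh handle.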

EDIT: regarding your stencil volume example, this could for example be done by drawing the mesh to the frame buffer using a stencil-volume material. That way a shader can extrude the volume where possible, and software has to extrude where necessary. If there is code shared between implementations, those functions are usually easy to abstract into a “helper” or “utilities” library or interface used by several of the drivers.

[This message has been edited by jwatte (edited 11-05-2002).]

Sidenote:

Originally posted by licu:
Look at the new engines appearing, UT decides to go only on DirectX now, Carmack is working only on OpenGL etc.
UT2k3 does include a fully functional OpenGL renderer. Play with your ini settings if you have the game.

The initial version of the original UT was D3D; they released their OpenGL renderer later. Epic have never had OpenGL as their main API, so this is not some new development.

To switch to OpenGL go to your UT2003 folder, navigate to the system folder, and open up UT2003.ini

At the top you’ll see:

RenderDevice=D3DDrv.D3DRenderDevice
;RenderDevice=Engine.NullRenderDevice
;RenderDevice=OpenGLDrv.OpenGLRenderDevice

Switch to OpenGL by changing to this:

;RenderDevice=D3DDrv.D3DRenderDevice
;RenderDevice=Engine.NullRenderDevice
RenderDevice=OpenGLDrv.OpenGLRenderDevice

It worked for me and fixed some black BSP planes I was seeing drawn on some levels.

[This message has been edited by dorbie (edited 11-06-2002).]