I couldn't find anywhere else to post my question but this forum, as the others are less professional or require a paid membership.

Anyway, I have created a graphics layer that's implemented in both OpenGL and Direct3D. I use row-major matrices on the application side, for example: v = M * v;

The coordinate system I use is right-handed.

The OpenGL implementation works just great. I used the same matrices for Direct3D; however, it just does not work. I tried every possible combination of row/column-major matrix ordering, trying to guess what was going on. Then I removed all the D3DXMatrix****** calls and used the same matrices I use for OpenGL.

The application goes well beyond a simple first-person camera; the viewport transformations are very similar to Google Maps.

My question: does Direct3D assume something about matrices even with a right-handed projection matrix? So far I assume it's a weird bug in the Direct3D June 2010 SDK.

Direct3D's default coordinate system is left-handed, while OpenGL's is right-handed. Direct3D also reads matrices as row-major where OpenGL reads them as column-major. Though I think that only applies when you access an element of the matrix with [][]; other than that, they read the same in a one-dimensional array (I usually store my matrices as a one-dimensional array).

You can change Direct3D's coordinate system to right-handed. I can't remember how off the top of my head, but it is easy to google.

I'm really stuck on a problem where I'm pretty sure the matrices I use are correct for both OpenGL and Direct3D (right-handed system, row-major), and I multiply in the correct order inside the shaders. That works perfectly for OpenGL, but Direct3D is unwilling to give me correct drawing; it just freaks out. I'm guessing either it assumes something internally about matrices so that a RHS setup can never work, or it's a bug never detected in game applications since it's really not so obvious.

I will post some code and you can tell me if I'm full of garbage or not.

D3D works perfectly with RH matrices as long as you use the -RH versions of the D3DX functions (D3DXMatrixOrthoRH, D3DXMatrixLookAtRH, etc.). One thing, though: the matrix multiplication order needs to be flipped.

Matrix is the matrix class I use for both graphics layers, GL and D3D, and all matrices are of the form: v' = M * v

I use the exact same matrices and logic for both GL and D3D.

The system is supposed to render a simple line grid in a viewport with a colored quad in the middle. When dragging the view, the object and grid lag behind the mouse movement. When zooming in/out, which is supposed to behave like Google Maps (zoom toward the point under the cursor), it does not zoom to the point; it's as if it were just scaling. In OpenGL it works just as it's supposed to, but the D3D behavior is awkward!

I actually use exactly the same orthographic projection matrix described in the OpenGL reference manual. It's a right-handed system in row-major order, and it works just perfectly for OpenGL.
I assumed Direct3D assumes nothing about the type of coordinate system or matrix, since it should be hardware-independent. Am I missing a flag/state setting in Direct3D to tell it I'm using a RHS and not exactly the matrix in the SDK docs?

I tried the D3DX***** functions and got almost the same weird results.

1 - Direct3D has assumptions about the coordinate system that screw things up when using a RHS. But it's not working in a LHS either.

2 - Direct3D's orthographic projection is not very well tested. Probably a bug sitting there, undetected by game developers. But damn it, XNA works just great with the same transformation system I have.