While I think that not releasing all the necessary tools for development was a very bad move, making 3D work in a hacked way in pretty much all DirectX games is still better than what PhysX did: real hardware, no games.
I agree, you can count on one hand the games that work without problems. Most of them have serious issues, like shadows rendered in screen space or post-processing effects computed for a single eye, etc.
And $400 for a 3D Vision setup (including a 22" screen) is actually not that bad.
But given the fact that a lot of OpenGL applications use custom matrix handling and parameter passing, that's kind of a no-go for us, it seems.
So for now, all I can do is accommodate the quad-buffer approach and hope for the best? [Do Nvidia/ATI people still attend these forums?]
I understand I need a Quadro to do the testing?
Yes, you need a Quadro. I got to do some testing here at work using an FX 5800 on my 22" 3D Vision enabled display, and it works great. But being from Europe, I had to solder up a 3D sync cable to connect the emitter to the Quadro, because Nvidia decided that the European market does not need the cable and they could save some money on it, while selling the 3D Vision set for the same figure in Euros as they charge in Dollars :p.
Aside from the criticism: I really think modern renderers cannot be hacked from the outside to do correct off-axis projection stereo. The developer needs the tools to make the right decisions (what to render, when, and to where). There is so much literature out there about how to do it right that they (ATI, Nvidia, Intel, etc.) don't even need to produce anything themselves besides enabling quad-buffer stereo on consumer-level products.
This is definitely one thing OpenGL currently has over D3D: it actually has an API for it.
Maybe we should take this to the future-OpenGL proposal threads, to make something like this required in core, but we all know how that will end ;).