I demand someone run with this idea !!

Okay, this is most extraordinary. Watch this video, then think about what it would be like if people started integrating this technology into OpenGL applications.

Scratch that thinking, I demand someone implement some cool demos using this tech and post their links back on this forum!

Nice stuff! Unfortunately I don’t own a Wii :slight_smile: Why don’t you do some demos yourself?

I don’t own a Wii either, but the remote can be bought as a stand-alone accessory for much less than an actual Wii.

Now as to why I’m not writing a demo using the technology/techniques described in the video … who’s to say I’m not? I just thought I’d pass the idea around to others and hope more people experiment with it as well. If enough people try it out, it’s likely more cool ideas will be created.

Peace out brotha’

Cool! This would let you sneak around the corner in a first person game…

Brilliant. I already support head-tracked ‘holographic’ stereo using expensive trackers from Intersense and the like. Combining a tracked head with a proper skewed-frustum stereo setup is incredibly thrilling to see, and now we can do the tracking for £15!
This will be a drop-in implementation… something for the new year.

Be sure to also look at the other Wii remote projects by the same author here. I think they are nearly as cool.

Something I think makes this project different from other head tracking solutions is that it provides real-time position and orientation in XYZ space. The Wii remote provides data at a faster rate, over a wider angle (the remote has a 45-degree FOV), and at a greater distance than those other, more expensive head-tracking-only tools. Finally, there is no SDK to integrate. Sources are available here and a discussion is here.

Something I think makes this project different from other head tracking solutions is that it provides real-time position and orientation in XYZ space.

You’ve never worked with a real VR installation, have you?

Indeed, what a load of bollocks.
IS900s, MotionStars, Lazerbirds: these are all much more stable and accurate than the Wii remote. The Lazerbird actually uses roughly the same principle as this Wii remote idea. Also, something like an IS900 doesn’t have the occlusion problem of the Wii remote (or the Lazerbird).
But this solution is so insanely cheap compared to all those that it’s actually a very significant development, in my opinion.

This is interesting. If I can get the hardware (and get it to work), I may make an update to that Free-Cam plugin for GLIntercept.

If anyone has success converting the sample project over to mostly C++, let me know.

There is also a free head tracking system, FreeTrack, which uses an ordinary webcam with some modding. Or the commercial TrackIR ($200), which has specialized hardware and targets gaming specifically.

I’ve had a little bash at integrating it with my system, and it’s pretty smooth tracking. The head orientation isn’t tracked very well, but that’s to be expected. Positional tracking is enough in most cases.

That’s really cool!

Wow, time to break out the psoc micro-controller and interface my mac via usb or ethernet. :slight_smile:

This is an OLD idea that’s just using the Wii remote as a head tracker. We were doing this years ago with head tracking and stereo. The screen becomes a window into the world, and the frustum calculation is trivial. I’ve advocated this for a long time (with or without tracking, it pays to get your frustum correct).

The interesting thing here is that it’s dirt cheap and there are millions of them deployed, the only thing missing is a game that uses it and ships with a pair of LEDs on a headband.

Of course the experience has to be calibrated for screen size but it’s still trivial to calculate.

The only problem is that the effect is very limited on a standard screen. You have the disadvantage of not seeing much of the world through a 20" window, so an unrealistic frustum is often better.

Now, with a projector and a really large screen, this becomes interesting. VR installations can become really scary when you don’t see the edge of the image :wink:

I think it’s funny. Better than no head tracking at all. For professional uses there are better things, but for home use, maybe for games (especially stealth games like Splinter Cell could use this to peek around corners), I think it’s a nice idea. Though I doubt it will be used anytime soon, and only if webcams integrate the technology, so that people don’t need to buy a Wii remote and build their own head-tracking devices.


It must look amazing on a big screen; indeed, I will definitely try it next year on our stereo back-projector in the lab. We also have a normal motion tracking system, but it’s not wireless and costs a little bit more :slight_smile:

You’d think it would look amazing, but when you actually see it working in a projector room where the image fully fills your peripheral vision, it looks just like a real room. It is so completely realistic that you don’t see it as a graphical trick, your brain just completely accepts it as being there.
Which is amazing in itself.

I have been playing around with his code a bit, but I can’t really establish the sensation that the scene is coming out of the screen. It is tracking the movement just fine, and I believe the numbers I am providing are accurate. Do you guys have any ideas of what I am doing wrong?

Basically, you need to build a frustum with its apex at the physical eye position (as tracked via the Wii sensor bar), and whose left, right, top, and bottom planes pass through the edges of the GL window.

Yes, you also need to apply a viewing translation to position the eye in the virtual world to match the real world movement relative to the screen (no rotation). This should all match the location used to calculate the frustum.