Hi everyone, I’m not sure if this is the right place to share this kind of thing, but I thought it might be useful to some people here!
AetherVR is a proof of concept I’ve been working on for some time that allows you to run OpenXR apps without VR hardware. Instead, it tracks your head and hand movements using your webcam and converts them into headset and controller inputs for applications. Hand gestures such as “Pinch” or “Fist” are translated into controller button presses such as “Trigger” or “Squeeze”.
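To give a rough idea of what that gesture-to-button translation involves, here is a minimal sketch in Python. This is purely illustrative and not AetherVR’s actual code; the function and mapping names are made up for this example.

```python
# Hypothetical sketch of a gesture-to-button mapping, similar in spirit
# to what AetherVR does. Names and structure are illustrative only.

# Detected hand gestures mapped to virtual controller inputs.
GESTURE_TO_BUTTON = {
    "pinch": "trigger",
    "fist": "squeeze",
}


def gesture_to_controller_input(gesture):
    """Translate a detected hand gesture into a virtual controller
    button press, or None if the gesture has no mapping."""
    return GESTURE_TO_BUTTON.get(gesture.lower())


print(gesture_to_controller_input("Pinch"))  # -> trigger
print(gesture_to_controller_input("Fist"))   # -> squeeze
```

In the real tool the gestures would of course come from a computer-vision pipeline analyzing webcam frames, and the resulting button states would be fed into the OpenXR input system rather than printed.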
This way, you can interact with a VR environment as if you were in it, even though everything is only displayed on your screen. You can use this tool without modifying your application because it provides a “fake” OpenXR runtime that generates these virtual inputs even though no actual VR device is connected.
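For those curious how a “fake” runtime like this can hook in without app modifications: the OpenXR loader discovers the active runtime through a JSON manifest, which can be selected per-process via the `XR_RUNTIME_JSON` environment variable. A manifest along these lines (the library path here is just an illustrative placeholder, not AetherVR’s actual filename) tells the loader which library to load:

```json
{
    "file_format_version": "1.0.0",
    "runtime": {
        "library_path": "./aethervr_runtime.dll"
    }
}
```

Because the app only ever talks to the loader, any runtime the manifest points at can supply the headset and controller state.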
This is primarily aimed at developers who don’t want to put on their VR headset every time they make a small change to their app. Instead, they can press “Run” in their game engine and interact with the app directly from the engine’s editor.
Another use case is playing VR games that remain playable without real VR hardware. You can actually play some SteamVR games with AetherVR by using OpenComposite, a translation layer that forwards OpenVR calls to OpenXR runtimes. Games where you stand in one place and grab things around you, like Job Simulator, work quite well, whereas more complex games like Half-Life: Alyx are basically unplayable.
The tool can be downloaded from the GitHub repository. The OpenXR runtime works best with Direct3D 11 apps, but there is also some support for Vulkan (Windows and Linux) and Metal (macOS).
Please let me know what you think!