Hello, I’m working on a VR locomotion peripheral device and looking into integrating the device with Meta Quest headsets, but am interested in future support for these devices by OpenXR in general. Examples of such devices are the Kat Walk C, Virtuix Omni, Omni One, Freeaim VR Shoes, Cybershoes, and Cyberith Virtualizer.
After reading through the documentation, searching the forum, and running a few test applications, I believe the core OpenXR API doesn’t have a way to support these devices. I’ll go into more detail below. It would be fantastic if future versions of OpenXR could include support.
I’ll explain why I believe OpenXR doesn’t support this, but would be happy if anyone could point out something I’m missing. I believe that for this to work, the OpenXR runtime would have to run the game application while a second, background application sends and receives data to and from the locomotion peripheral. Another possibility is to use an API layer, which I’ll discuss later.
The background application would need to run headless, but right now the core OpenXR API doesn’t support that. There is an extension to support headless applications, but Meta doesn’t support it. You can see this discussion for further details: “OpenXR without rendering loop” (I’m unable to include the actual link)
Even if Meta did support this extension, it’s not clear to me if the headless application would be able to send input actions to the game application. Can multiple OpenXR applications interact with each other? After reading the following post, it seems like OpenXR doesn’t support scenarios like this very well: “Better support for mutliple simultaneous applications”
I got the idea of using an API layer from this post: “Query HMD Location in any game”
An API layer could, in theory, intercept calls to xrGetActionState* functions and inject actions from the peripheral device. It could also retrieve other data it needs, such as HMD position and heading.
However, after reading more about API layers, I don’t think this approach will work either. API layers are enabled in one of two ways. First, the loader automatically enables API layers whose manifest files are placed in certain system locations; on Android, all of these locations are writable only by root. Second, the application itself can put the manifest files in the APK’s assets folder. So if an API layer were created that connects to a peripheral device, either it would need to be loaded implicitly by placing the manifest file in a root-only location (i.e., by Meta), or the game developer would need to add support for loading the API layer. For the first approach, it’s also important to note that Meta ships its own custom OpenXR loader, so it’s not clear how that loader enables API layers.
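For anyone unfamiliar, the manifest the loader looks for is a small JSON file in the format described by the OpenXR loader specification; something like the following, where the layer name and library path are placeholders I made up:

```json
{
    "file_format_version": "1.0.0",
    "api_layer": {
        "name": "XR_APILAYER_VENDOR_locomotion_input",
        "library_path": "libXrApiLayer_locomotion.so",
        "api_version": "1.0",
        "implementation_version": "1",
        "description": "Injects input from a locomotion peripheral"
    }
}
```

On Android this would go either into one of the root-only system manifest directories (for implicit loading) or into the game APK’s assets, which is exactly the constraint described above.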
The Kat Walk C and Cybershoes have support for some Quest games, but to get it they had to work around these limitations by asking individual developers to do one of two things. First, integrate an SDK into the game itself. Second, update the game to support Bluetooth gamepads, so the peripheral device can emulate a gamepad. The disadvantage of the latter approach is that HMD position and rotation data can’t be sent to the peripheral device. Neither approach is scalable.