OpenXR support for VR locomotion peripherals

Hello, I’m working on a VR locomotion peripheral device and looking into integrating it with Meta Quest headsets, but I’m also interested in future OpenXR support for these devices in general. Examples of such devices are the Kat Walk C, Virtuix Omni, Omni One, Freeaim VR Shoes, Cybershoes, and Cyberith Virtualizer.

After reading through the documentation, searching the forum, and running a few test applications, I believe the core OpenXR API doesn’t have a way to support these devices. I’ll go into more detail below. It would be fantastic if future versions of OpenXR could include support.

I’ll explain why I believe OpenXR doesn’t support this, but I’d be happy if anyone could point out something I’m missing. For this to work, I believe the OpenXR runtime would have to run the game application while a separate background application sends and receives data to and from the locomotion peripheral. Another possibility is to use an API layer, which I’ll discuss later.

The background application would need to run headless, but right now the core OpenXR API doesn’t support this. There is an extension (XR_MND_headless) that supports headless applications, but Meta doesn’t implement it. You can see this discussion for further details: “OpenXR without rendering loop” (I’m unable to include the actual link)

Even if Meta did support this extension, it’s not clear to me whether the headless application would be able to send input actions to the game application. Can multiple OpenXR applications interact with each other? After reading the following post, it seems OpenXR doesn’t handle scenarios like this very well: “Better support for multiple simultaneous applications”

I got the idea of using an API layer from this post: “Query HMD Location in any game”

An API layer could, in theory, intercept calls to xrGetActionState* functions and inject actions from the peripheral device. It could also retrieve other data it needs, such as HMD position and heading.

However, after reading more about API layers, I don’t think this approach will work either. API layers are enabled in one of two ways. First, the system can have manifest files in certain locations where API layers are enabled automatically by the loader; on Android, all of these locations are accessible only to root users. Second, the application itself can put the manifest files in the APK assets folder. So if an API layer were created that connects to a peripheral device, either the layer would need to be implicitly loaded by putting the manifest file in a location accessible only to root users (i.e., Meta), or the game developer would need to add support for loading the API layer. And for the first approach, it’s important to note that Meta ships its own custom OpenXR loader, so it’s not clear how that loader enables API layers.
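For reference, the manifest the loader looks for follows the standard OpenXR API layer JSON format; a sketch for a hypothetical peripheral-input layer might look like this (the layer name, library path, and description are made up for illustration):

```json
{
    "file_format_version": "1.0.0",
    "api_layer": {
        "name": "XR_APILAYER_VENDOR_peripheral_input",
        "library_path": "libperipheral_layer.so",
        "api_version": "1.0",
        "implementation_version": "1",
        "description": "Injects locomotion peripheral input into action state"
    }
}
```

On Android, the whole question is where this file can live and who is allowed to put it there.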

The Kat Walk C and Cybershoes have support for some Quest games, but to get there they had to work around these limitations by reaching out to individual developers to do one of two things: first, integrate an SDK into the game itself; second, update the game to support Bluetooth gamepads so the peripheral device can emulate a gamepad. The disadvantage of the latter approach is that HMD position and rotation data can’t be sent to the peripheral device. Neither approach is scalable.

It’s not possible, and I think there’s a good reason for it. If a background app is able to spoof user inputs, this opens up many security issues, such as malicious background apps being able to steal private data or perform unwanted operations (send inputs to force taking a screenshot of what you are doing, or send unwanted messages to other users).

It would also be somewhat complicated for applications to handle several, possibly contradictory, inputs coming from different sources.

I don’t think this is truly a shortcoming of OpenXR; it’s more about Android’s quite strict security policies. Being able to inject an API layer into any application on Windows is a luxury, because that app model is a lot less secure than Android’s. I believe the fear around allowing this (and it’s a similar fear on platforms like PlayStation or Xbox) is ending up with clever hackers able to inject code that bypasses things like entitlement checks, effectively making “every game free”.

Requiring app developers to “opt in” by including the API layer in their package, along with all sorts of options (allowing the user to enable/disable it at startup, and providing the necessary UX/dialogs to make the experience smooth), sounds like a reasonable ask.

BTW - I’m not advocating that today’s options are OK. As an API layer developer, I’ve experienced many of the pains they create. I’m only hypothesizing about why this isn’t an easy problem to solve once you start truly considering the many, many bad actors out there who could use such a feature to break a platform.

Thanks for your reply. I agree it’s not easy to solve, and the security concerns did cross my mind. Looking at how Steam handles third-party device drivers and overlays could help. A more limited API layer mechanism for peripheral devices could possibly work. I would just like a better solution than having to get every game developer to opt in or implement support in their game.

Yeah, it’s definitely not optimal. There is some progress and work, but what’s mainly missing is an actual stakeholder (somebody making such a device, for instance) championing the required extensions or other changes needed to make it happen. We know we’re missing use cases, but we are all limited in the time we can dedicate to OpenXR things.

Thanks. I could possibly try to champion the changes needed, as you put it. How would I go about doing that? Maybe the first step would be to learn more about how OpenXR extensions are proposed and implemented, and then propose an extension myself?