Third handset or second gamepad? & other questions


I create VR game software using hand-rolled C and open source libraries. I am interested in making use of OpenXR; however, I can only adopt it if it offers functionality equivalent to the APIs I already use. I am reading the 0.90.1 spec and I have some questions.

  • There is a /user/hand/left and a /user/hand/right. The spec says: “If more than two hand-held controllers or devices are active, the runtime must determine which two are accessible as /user/hand/left and /user/hand/right.” This text seems to imply that if there are more than two handsets connected, the third cannot be exposed at all. However there are some existing setups which intentionally use a third controller. For example I have heard from people who use a third controller as a “camera” in a mixed-reality filming setup (you can find a blog post about this if you google for “Kert Gardner mixed reality”; the author had to plug in an additional Bluetooth dongle because the Vive hardware had a limit of two controllers, but by their description it sounds like OpenVR could handle it). Does OpenXR support handsets n > 2 at all?

  • There is a /user/gamepad. What if there are two gamepads? (For an example of why this might happen, consider a party game with asymmetric multiplayer, where one player has a VR headset and the other players are using controllers while watching the game on the computer monitor. Maybe the VR player is a giant swinging a hammer and the controller players are running around at their feet shooting arrows. For a game like this it would be completely normal to have two or more USB controllers.) Is OpenXR limited to a single gamepad?

  • In OpenXR, how do you detect device presence? For example, say I simply want to know if /user/gamepad or /user/hand/left is currently connected. I spoke to someone who suggested I could attempt to create an action for the device and detect an error if this fails, but there is no single action to which all devices respond, and this approach is also inconvenient for checking whether the device list has changed from frame to frame. In real-world use it is normal for devices to appear and disappear while a game is running. Is there a single test that allows checking “present” for any device regardless of kind? Is there an event which gets sent when a device connects or disconnects?

Depending on the answers to these questions, I might have some suggestions in the feedback thread. Thanks!


Hi @mcc!

To your three questions:

  • The OpenXR core spec does not define user paths for motion controllers beyond a primary left-hand controller and a primary right-hand controller. However, I would expect runtimes that do support tracking additional controllers to introduce vendor extensions that expose them through vendor-specific paths. As vendors then agree on how to represent those additional controllers in a conformant way, those vendor extensions could be promoted to cross-vendor extensions, and ultimately into a future revision of the core spec.
  • Similar story for gamepads: The OpenXR core spec defines paths that support exposing a single primary gamepad to the developer through the input actions system, with the choice of which gamepad is primary being up to the user of that runtime (e.g. based on which gamepad they picked up to use for navigating the launcher, etc.). Runtimes are then likely to experiment with extensions that allow you to access more gamepads or other USB controllers as part of the input action system. However, note that nothing stops you in the meantime from using other APIs like SDL2/XInput/etc. to access the system’s gamepad data as you would in a flat app - they just wouldn’t be able to participate as bindings in the input action system.
  • In OpenXR, you reason about which XR input devices are currently present and active by negotiating interaction profiles with the runtime. You first let the runtime know which interaction profiles (i.e. unique input device form factors) your app knows about by calling xrSetInteractionProfileSuggestedBindings once for each interaction profile that you support. You can then find out which interaction profile is present on that system for a given top-level user path (e.g. /user/hand/left) by calling xrGetCurrentInteractionProfile. For example, if your app suggested bindings for both /interaction_profiles/oculus/touch_controller and /interaction_profiles/htc/vive_controller, your app would then know whether the user’s left-hand controller is an Oculus Touch controller or an HTC Vive controller. If the controller turns out to be a future controller that your app has never heard of, the current interaction profile will be whichever known form factor is most similar to that controller, with the runtime providing an automatic or user-provided mapping to one of the controllers your app does know about.

    You can then check whether any individual action you created is active or inactive by checking the isActive field that you get back from xrGetActionState*. Note that you can only check whether an action is active, not whether a whole device like a motion controller is present, as users may use the runtime’s rebinding UI to remap some of your actions to one device and other actions to another device, including devices that may not have existed when your app was last updated. Checking isActive at the action level helps make sure that you can reason about the state of the devices that your user ultimately chose to use for a given action.
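To make the interaction-profile flow in the last bullet concrete, here is a rough sketch in C. It uses OpenXR 1.0-style names and structs (xrStringToPath, xrGetCurrentInteractionProfile, xrGetActionStateBoolean; the provisional 0.90.x names differ slightly), and it assumes an already-initialized instance, session, and a previously created boolean action — none of that setup is shown here. Treat it as an illustration of the two checks, not a drop-in implementation:

```c
#include <openxr/openxr.h>
#include <stdio.h>

/* Sketch: (1) ask which interaction profile (device form factor) is
 * currently bound for the left hand, then (2) check per-action presence
 * via isActive. Assumes `instance`, `session`, and a boolean action
 * `grab_action` were created earlier (not shown). */
static void report_left_hand(XrInstance instance, XrSession session,
                             XrAction grab_action)
{
    XrPath left_hand = XR_NULL_PATH;
    xrStringToPath(instance, "/user/hand/left", &left_hand);

    /* 1. Which interaction profile is active for this top-level path?
     *    XR_NULL_PATH means no profile is currently bound there. */
    XrInteractionProfileState profile = {
        .type = XR_TYPE_INTERACTION_PROFILE_STATE };
    if (xrGetCurrentInteractionProfile(session, left_hand, &profile)
            == XR_SUCCESS
        && profile.interactionProfile != XR_NULL_PATH) {
        char name[XR_MAX_PATH_LENGTH];
        uint32_t len = 0;
        xrPathToString(instance, profile.interactionProfile,
                       sizeof name, &len, name);
        printf("left hand profile: %s\n", name);
    }

    /* 2. Per-action presence: isActive is false when nothing usable is
     *    currently bound to this action for this subaction path. */
    XrActionStateGetInfo get_info = {
        .type = XR_TYPE_ACTION_STATE_GET_INFO,
        .action = grab_action,
        .subactionPath = left_hand };
    XrActionStateBoolean state = { .type = XR_TYPE_ACTION_STATE_BOOLEAN };
    if (xrGetActionStateBoolean(session, &get_info, &state) == XR_SUCCESS)
        printf("grab action active: %s\n", state.isActive ? "yes" : "no");
}
```

Polling isActive each frame (after your per-frame action sync call) covers the “device list changed from frame to frame” case at the granularity of actions rather than devices.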
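For the gamepad bullet above, the “use another API in the meantime” route can be sketched with SDL2, which can enumerate and open every connected gamepad independently of OpenXR. This is the standard SDL2 GameController pattern, nothing OpenXR-specific; output depends on what is plugged in:

```c
/* Sketch: enumerating all connected gamepads with SDL2, outside the
 * OpenXR input action system. Requires SDL2 headers and library. */
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Joystick indices cover all game input devices; filter down to
     * the ones SDL recognizes as gamepads. */
    int n = SDL_NumJoysticks();
    for (int i = 0; i < n; i++) {
        if (SDL_IsGameController(i)) {
            SDL_GameController *pad = SDL_GameControllerOpen(i);
            if (pad)
                printf("gamepad %d: %s\n", i, SDL_GameControllerName(pad));
        }
    }

    /* SDL also delivers SDL_CONTROLLERDEVICEADDED and
     * SDL_CONTROLLERDEVICEREMOVED events through its event loop,
     * which handles hot-plugging for the non-primary gamepads. */
    SDL_Quit();
    return 0;
}
```

The trade-off, as noted above, is that gamepads accessed this way cannot participate as bindings in the OpenXR input action system.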