Is velocity-control-style interaction (cf. 3D Connexion’s Space Mice) supported in OpenXR? How would devices supporting both position control (e.g. 3D motion tracking) and velocity control be handled? cf. Martin Sundin & Morten Fjeld (2009) Softly Elastic 6 DOF Input, International Journal of Human–Computer Interaction, 25:7, 647-691, DOI: [10.1080/10447310902964124]
In what format does the device output its data to the application? If it outputs a simulated position and velocity, it might be possible to have it work the same way as motion controllers, which use pose inputs that can be queried for position and orientation as well as linear and angular velocity.
The format for position and orientation is standard: a vector and a quaternion. So if I understand correctly, a motion controller’s simulated velocity is proportional to the physical controller’s velocity (i.e. position control), whereas for 3D Connexion’s Space Mice the simulated velocity is proportional to the physical cap’s deflection (i.e. velocity control). For my device, which supports both position and velocity control, I would need a query to check which control mode is active, i.e. position or velocity control. How would I do that? The user would also need to be able to change the control mode, from a GUI for instance, so there needs to be a protocol for control-mode changes. Has that been foreseen in OpenXR? Would I need to write an extension for it? Using XrActionStateBoolean?
That is an interesting question. The SpaceMouse actually reports six analog values for linear and angular displacement, which applications typically treat as velocities and integrate into poses. I would tend to think of it being used that way, to drive a simulated position and orientation of e.g. a controller, as can be done with VRPN’s SpaceMouse driver combined with its “AnalogFly” driver: I’ve used a SpaceMouse this way in the past (though pre-OpenXR). If you wanted to switch between position- and velocity-control mode live, that would be a runtime-level control that would not be exposed to the application and thus out of scope for the OpenXR API: just do it in the runtime.
The spec does give some consideration to handling other velocity-based devices more natively, such as omnidirectional treadmills, but since no vendors of such devices are participating right now we haven’t filled out those concepts further. This could easily be addressed in one or more extensions to the spec.
That said, you would need a runtime that actually does this. I don’t yet know of any that support space mice, though it would, for example, be fairly easy to add a VRPN driver to Monado to support them.
Switching between position- and velocity-control mode live could also be driven from another input device, such as a keyboard shortcut when using Blender, for instance. For that case I suppose one would need some OpenXR API adaptation.
No, I think even that would be handled entirely inside the runtime and remain transparent to the app.