Will an upcoming OpenXR spec provide an interface for playing a buffer of haptic feedback?
In the provisional spec, a base `XrHapticVibration` struct is available for requesting that the runtime play haptics. However, it only gives room to specify the duration, frequency, and amplitude of a single vibration. If an application instead wants to play a clip of haptic feedback, I'd expect it would need to do something hacky like `xrApplyHapticFeedback(...); nanosleep(...); xrApplyHapticFeedback(...); nanosleep(...);` etc., breaking the clip into samples and blocking the thread in between them. And since playing the clip could easily take longer than the time to the next frame, this would likely have to be done on a separate thread. With a buffered interface, the application could instead fire off a single `xrApplyHapticFeedback(...)` call with an `XrHapticVibrationBuffer` and forget about it.
The Oculus SDK already provides buffered haptic playback for its Touch controllers, and Unity has exposed a `HapticCapabilities.supportsBuffer` flag since its 2018.3 release. An OpenXR interface could be particularly helpful as a common abstraction over current and future devices that permit audio-rate control, like haptic vests à la Subpac and Tactsuit. If I understand correctly, this could be done as an extension or through the future device plugin interface. I thought it could benefit from a standard interface, though, since from an application's point of view, a buffered haptic device just looks like a speaker/DAC it submits buffers to.