Hardware Tracker / Vive Tracker with OpenXR

Hello everyone,

I wonder what the current status of hardware trackers, such as the Vive Trackers, is with regard to the OpenXR specification. If I see it correctly, the current specification does not yet cover trackers or provide an extension for them. We would, however, love to be able to make use of them in our projects.

Any news regarding tracker support?

Thank you

I would get in touch with the vendor (HTC) and let them know your use cases. They’re the ones who can submit and implement a vendor extension. If you post your use cases here as well, then the whole group can take them into account for future extensions.

I’m aware of two basic “classes” of common Vive Tracker use: body/skeleton tracking (adding additional tracked points to the user’s body for e.g. avatar animation), and virtual production (installation-specific applications with trackers on virtual cameras/displays, scenery, etc.). Do your usages fit in one of those, or are you in a third class of use cases?

I’m not the OP, but I had the same question. The use case I’m interested in is using the Vive Tracker to track an object like a bat or racket. In fact, I’d be happy enough if it behaved like another controller, as I only need the position and rotation information, not any button actions. I’m using Unity 2020.3 with OpenXR and the Vive Tracker 1.0, but from what I’ve read online, HTC have not yet agreed on the spec for OpenXR. So any information on a likely timeline, or suggestions on the best approach to make this work, would be appreciated. I’m considering going back to Valve’s OpenVR if that’s the only option right now.

I need Vive trackers support in OpenXR in Unity for body/skeleton tracking.

I guess I will reach out to HTC then.

Our use case is the second one you describe. We have trackers on real screens and mirror them in VR (Unity game engine). We only need position and rotation info to achieve this. We’re currently using Vive Controllers as a workaround, but they don’t provide the user experience we’re looking for.


Yes, trackers are used for all kinds of things: motion capture, training, location-based entertainment, robotics. I get that this isn’t the foremost use case, but it’s a fairly important one.

Don’t worry, we know it’s important 🙂 Knowing the various use cases helps us as a working group understand how folks are actually using this multifunction device ability today. That helps us come up with better cross-platform/cross-vendor API shapes, so thanks for the feedback.

So what I think we have so far is the following, in approximate order of frequency:

  • body/skeleton tracking
  • virtual production
  • LBE: probably custom controllers and props? (Would you consider training to fall in this category? or is there some unique aspect to the usage of the trackers here?) I imagine this might be one case where the pogo-pins for inputs besides pose are actually used?
  • Robotics: is this for measuring movements? Navigation? As part of a control loop, or external? This is a pretty broad topic - many of us are familiar with using robotics to calibrate/test tracking, and a few are familiar with “gently armwrestling” for haptics (this video of mine is probably half robotics, half training in terms of applications: Combining Haptics with an Omnidirectional Mobile Robot - YouTube)

(I’ve done all of these except virtual production and mocap myself, so I know they exist, just trying to get more data than my own experiences)

Edit: Also helpful to know - for each use case, how important is “portability” to other hardware? Is the developer the same as the user? (e.g. virtual production, location-based) or will the app eventually become “legacy software” used beyond the support of the developer?

Body/Skeleton tracking is fairly common. I see this with enthusiasts, prosumers, as well as seasoned/entrenched professionals.

Virtual production is more for prosumers and seasoned professionals. It appears to be quickly gaining traction.

LBE isn’t something I know as much about, and with COVID I have no idea how that industry is doing right now, but there are certainly uses for trackers in this field.

For robotics, I’ve seen use cases where trackers are used for measuring movements / ground truth for calibrating internal sensors, sensor fusion, some small navigation tasks. I’ve seen it as an indirect control loop as well. I’ve also seen them used to deconflict multiple disparate coordinate systems.

An initial Vive Tracker extension, XR_HTCX_vive_tracker_interaction, has been released with the latest 1.0.20 patch release: The OpenXR Specification

Please have a look and try it out, and share your feedback with the working group so this sort of functionality can be improved going forward.