I wonder what the current status of hardware trackers, such as the Vive Tracker, is with regard to the OpenXR specification. If I see it correctly, the current specification does not yet cover trackers or provide an extension for them. We would, however, love to be able to make use of them in our projects.
Any news regarding tracker support?
I would get in touch with the vendor (HTC) and let them know your use cases. They’re the ones who can submit and implement a vendor extension. If you post your use cases here as well, then the whole group can take them into account for future extensions.
I’m aware of two basic “classes” of common Vive Tracker use: body/skeleton tracking (adding additional tracked points to the user’s body for e.g. avatar animation), and virtual production (installation-specific applications with trackers on virtual cameras/displays, scenery, etc.). Do your usages fit in one of those, or are you in a third class of use cases?
I'm not the OP, but I had the same question. The use case I am interested in is using the Vive Tracker to track an object like a bat or racket. In fact, I'd be happy enough if it behaved like another controller, as I only need the position and rotation information, not any button actions. I'm using Unity 2020.3 with OpenXR and the Vive Tracker 1.0, but from what I've read online, HTC has not yet finalized the spec for OpenXR. So any information on a likely timeline, or suggestions on the best approach for me to make this work, would be appreciated. I'm considering going back to Valve's OpenVR if that's the only option right now.
I need Vive trackers support in OpenXR in Unity for body/skeleton tracking.
I guess I will reach out to HTC then.
Our use-case is the second use-case that you describe. We have trackers on real screens and mirror them in VR (Unity game engine). We’d only need position and rotation info to achieve this. Currently using Vive Controllers instead as a workaround, but it does not provide the user experience we’re looking for.
Yes, trackers are used for all kinds of things: Motion capture, training, location-based entertainment, robotics. I get that this isn’t the foremost use-case, but it’s a fairly important one.
Don’t worry, we know it’s important. Knowing the various use cases helps us as a working group understand how folks are actually using this multifunction device ability today. That helps us come up with better cross-platform/cross-vendor API shapes, so thanks for the feedback.
So what I think we have so far is the following, in approximate order of frequency:
- body/skeleton tracking
- virtual production
- LBE: probably custom controllers and props? (Would you consider training to fall in this category? or is there some unique aspect to the usage of the trackers here?) I imagine this might be one case where the pogo-pins for inputs besides pose are actually used?
- Robotics - is this for measuring movements? navigation? as a part of a control loop or external? This is a pretty broad topic - many of us are familiar with using robotics to calibrate/test tracking, and a few are familiar with “gently armwrestling” for haptics (this video of mine is probably half robotics, half training in terms of applications: Combining Haptics with an Omnidirectional Mobile Robot - YouTube)
(I’ve done all of these except virtual production and mocap myself, so I know they exist, just trying to get more data than my own experiences)
Edit: Also helpful to know - for each use case, how important is “portability” to other hardware? Is the developer the same as the user? (e.g. virtual production, location-based) or will the app eventually become “legacy software” used beyond the support of the developer?
Body/Skeleton tracking is fairly common. I see this with enthusiasts, prosumers, as well as seasoned/entrenched professionals.
Virtual production is more for prosumers, and seasoned professionals. This appears to be quickly gaining traction.
LBE isn’t something I know as much about, and with COVID, I have no idea how this industry is doing right now, but certainly there are uses in this field.
For robotics, I’ve seen use cases where trackers are used for measuring movements / ground truth for calibrating internal sensors, sensor fusion, some small navigation tasks. I’ve seen it as an indirect control loop as well. I’ve also seen them used to deconflict multiple disparate coordinate systems.
An initial Vive Tracker extension has been released with the latest 1.0.20 patch release: The OpenXR Specification
Please have a look and try it out, and share your feedback with the working group so this sort of functionality can be improved going forward.
I’m working on the implementation of this extension right now and got it working for a single HTC Vive Tracker using example 1 from the specification.
I would also like to support multiple trackers, but I was not able to get that working.
My guess was that I could target different trackers via different roles, so I added bindings with different role paths to the actionSuggBindings vector. But the action is successfully bound only for the first entry of that vector.
Is there a way to target multiple trackers with this extension?
Hi markusrapp, does that mean it’s already possible to use Vive trackers in Unity? I have not upgraded my project from SteamVR to OpenXR because it lacked Vive tracker support.
@joshua321 In Unity, I don’t know. I’m not using Unity for VR at the moment and I’m not a Unity developer. I guess it will take some time until trackers are fully supported in Unity through OpenXR. Not all OpenXR runtimes fully support it yet. For SteamVR, support just landed in a beta release, and it doesn’t look like the extension is fully supported there yet. That is probably also the reason why multiple trackers don’t work yet with SteamVR via the XR_HTCX_vive_tracker_interaction extension in OpenXR.
Any news on this? For my new project I would like to use the XR system in Unity, including the XR Input after working with SteamVR and the legacy VR system for years…
My use case is fullbody tracking with FinalIK.
In this thread on the Unity forums, Valve and Unity engineers are working on this.
Looks like it’s nearly finished.
I found a major bug when using Vive trackers in OpenXR with Unity + SteamVR where the screen goes black. I am also posting about the issue here on the forum to raise its profile. Is there anyone on the OpenXR team who can verify this?
I have tested Vive Cosmos + Vive trackers and also Oculus Quest 2 + Vive trackers, and both times I just get a black screen.
The “OpenXR team” is the OpenXR working group, which works on the spec. There’s no overarching implementation team - you would want to get in touch with Unity and/or SteamVR directly for this issue.