I noticed that Meta’s OpenXR runtime (the one that comes with the Meta Horizon Link app) doesn’t seem to support some pretty fundamental extensions, such as
XR_EXT_hand_tracking
XR_EXT_hand_joints_motion_range
XR_EXT_hand_interaction
XR_EXT_hand_tracking_data_source
XR_EXT_uuid
and the more recent
XR_EXT_render_model
XR_EXT_interaction_render_model.
I’m using a Quest 3 to develop a PCVR app, and this was a bit shocking to me, given that SteamVR (which I usually use) supports all of these without any issues. I mean, Meta’s is the “official” runtime for the headset that I’m using. Why can’t I have simple things?
What’s the reasoning behind this lack of support?
Thanks
Many extensions, including hand tracking, are locked behind developer mode on Link, which requires registering a developer account. Meta hasn’t made any moves toward improving this support since Link was released. It seems Meta regards PC support as a legacy or developer-only feature.
Hmm, so even if I register a developer account I may get access to them, but then potential third parties (friends, customers, etc.) won’t be able to use such an app, since they can’t be forced in any way to have a dev account. Is this correct?
I ran into this issue last year. Meta’s OpenXR runtime doesn’t expose several key extensions by default—like XR_EXT_hand_tracking, XR_EXT_render_model, and others. To access them, you need to go through Meta’s developer portal, request access to their SDK, and unlock your device. They use a gated system that tracks usage, which some devs (myself included) find intrusive.
When I applied to help design this, Meta told me they had no plans to implement those extensions—citing “security restrictions” or similar rhetoric. I wasn’t interested in building under those constraints, so I walked away and focused on building my own headers and engine. If you’re comfortable with C++ and OpenXR internals, you can stub out or emulate some of the extension behavior yourself. It’s not trivial, but it’s doable—and you won’t have anyone peering over your shoulder.
How OpenXR Extensions Work
OpenXR extensions are optional modules layered on top of the core API. They let vendors expose device-specific features without breaking compatibility. Here’s how a few of the Meta-relevant ones work:
XR_EXT_hand_tracking: Provides skeletal hand joint data for real-time tracking.
XR_EXT_hand_joints_motion_range: Lets you query joint flexibility (natural vs. constrained).
XR_EXT_render_model: Enables access to controller and device models for rendering.
XR_EXT_interaction_render_model: Adds dynamic interaction cues to render models.
XR_EXT_uuid: Assigns persistent identifiers to runtime objects.
These extensions are defined by the Khronos Group but must be implemented by the runtime vendor. If Meta doesn’t expose them, you’re stuck unless you switch runtimes or build your own support.
Boilerplate
// xr_ext_hand_tracking_mock.h
// NOTE: recent official openxr.h releases already define the
// XR_EXT_hand_tracking types; only use this mock when building against
// a header that lacks them, or guard/rename to avoid redefinitions.
#ifndef XR_EXT_HAND_TRACKING_MOCK_H_
#define XR_EXT_HAND_TRACKING_MOCK_H_
#include <openxr/openxr.h>
#include <stdint.h> /* not <cstdint>: keep this header valid C as well as C++ */
#ifdef __cplusplus
extern "C" {
#endif
// Extension identity
#define XR_EXT_HAND_TRACKING_SPEC_VERSION 4
#define XR_EXT_HAND_TRACKING_EXTENSION_NAME "XR_EXT_hand_tracking"
// Mock structure type constants (pick values that won't collide with real ones in your build)
#define XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT ((XrStructureType)1000019000)
#define XR_TYPE_HAND_JOINT_LOCATIONS_EXT ((XrStructureType)1000019001)
// Hand enum
typedef enum XrHandEXT {
XR_HAND_LEFT_EXT = 1,
XR_HAND_RIGHT_EXT = 2
} XrHandEXT;
// Joint enum (shortened; extend as needed)
typedef enum XrHandJointEXT {
XR_HAND_JOINT_PALM_EXT = 0,
XR_HAND_JOINT_WRIST_EXT = 1,
XR_HAND_JOINT_THUMB_METACARPAL_EXT = 2,
XR_HAND_JOINT_THUMB_PROXIMAL_EXT = 3,
XR_HAND_JOINT_THUMB_DISTAL_EXT = 4,
XR_HAND_JOINT_THUMB_TIP_EXT = 5,
// ... add remaining joints up to XR_HAND_JOINT_COUNT_EXT - 1
XR_HAND_JOINT_COUNT_EXT = 26
} XrHandJointEXT;
// Per-joint location
typedef struct XrHandJointLocationEXT {
XrBool32 isActive;
XrPosef pose;
float radius;
} XrHandJointLocationEXT;
// Input for locating joints
typedef struct XrHandJointsLocateInfoEXT {
XrStructureType type; // should be XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT
const void* next;
XrSpace baseSpace;
XrTime time;
XrHandEXT hand;
} XrHandJointsLocateInfoEXT;
// Output container for located joint info
typedef struct XrHandJointLocationsEXT {
XrStructureType type; // should be XR_TYPE_HAND_JOINT_LOCATIONS_EXT
void* next;
uint32_t jointCount;
XrHandJointLocationEXT* jointLocations;
} XrHandJointLocationsEXT;
// Function pointer type for xrLocateHandJointsEXT
typedef XrResult (XRAPI_PTR *PFN_xrLocateHandJointsEXT)(
XrSession session,
const XrHandJointsLocateInfoEXT* locateInfo,
XrHandJointLocationsEXT* locations
);
// Helper: attempt to load the function pointer via xrGetInstanceProcAddr
static inline PFN_xrLocateHandJointsEXT xrLoadLocateHandJointsEXT(XrInstance instance) {
PFN_xrLocateHandJointsEXT fn = NULL;
if (instance != XR_NULL_HANDLE) {
xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT", (PFN_xrVoidFunction*) &fn);
}
return fn;
}
#ifdef __cplusplus
}
#endif
#endif // XR_EXT_HAND_TRACKING_MOCK_H_
Thank you for the tips, although I don’t exactly understand how one can “build your own support”. For example, where am I going to get the real-time data for the hand joints from, in order to build something similar to a desired extension? Is there another API that provides them? Also, if any of that requires unlocking devices and developer modes, then what’s the benefit of doing that if it’s not going to work on any other device except yours?
Great questions – let me clarify what I meant by “build your own support.”
When an extension isn’t exposed by the driver, you can sometimes replicate the functionality by pulling the same data through another API that is available. For example, on Meta devices the hand joint data isn’t coming from OpenGL itself, it’s provided through the OpenXR runtime. OpenXR has standardized input paths for hand tracking, so you can query joint positions in real time there. Once you have that data, you can feed it into your own rendering pipeline as uniforms or SSBOs, effectively “building support” for features you wish existed as a native GL extension.
This doesn’t require unlocking the device or running in some special developer mode. It’s just using the APIs Meta already exposes. The benefit is portability: OpenXR is cross‑vendor, so the same hand‑tracking code works on other devices that implement the spec. You’re not locked into a single headset.
So the idea isn’t to hack around missing extensions, but to bridge the gap by combining APIs. GL handles the rendering, OpenXR provides the tracking data, and you stitch them together. That way you can prototype features even if the driver doesn’t advertise a dedicated extension.
Sorry, I still don’t understand. Let’s put rendering aside for a while; I’m fine with that aspect. So, in order to get the hand joint locations, I need to call xrLocateHandJointsEXT(). This is where the joints’ data will come from. In order to be able to call xrLocateHandJointsEXT(), I need to have the XR_EXT_hand_tracking extension enabled and supported by the runtime. But Meta’s OpenXR runtime does not support (expose) this extension/API unless you enable a developer account (which isn’t a solution at all).
So if I can’t call xrLocateHandJointsEXT() at all, where am I going to pull the data from? What is the “another API that is available”?
Meta’s runtime commonly exposes FB-prefixed hand-tracking extensions that provide hand poses, joint data, and related features. These FB extensions are vendor-specific but do supply the joint/mesh/aim data you’d otherwise get from the EXT API; the function names and structures differ from xrLocateHandJointsEXT so you must call the FB equivalents on Quest runtimes. The Khronos registry documents FB hand‑tracking mesh extensions and their APIs for reference [9]. If third‑party layers (e.g., Manus, Ultraleap) are installed, they may also expose hand tracking via either EXT or FB chains, depending on the layer and device setup [3][4].
Quick, concrete checklist (what to do)
At startup, enumerate instance extensions and check for XR_EXT_hand_tracking first. If present, use xrLocateHandJointsEXT.
If EXT is not present, check for FB hand extensions (e.g., XR_FB_hand_tracking, XR_FB_hand_tracking_mesh, XR_FB_hand_tracking_aim) and call the FB APIs those extensions define [1].
If neither is present, disable hand features or provide an alternate input path (controllers, gaze, etc.) so your app still runs on that runtime.
Example runtime check (pseudocode)
if (hasExtension("XR_EXT_hand_tracking")) {
// call xrLocateHandJointsEXT(...)
} else if (hasExtension("XR_FB_hand_tracking") ||
hasExtension("XR_FB_hand_tracking_mesh") ||
hasExtension("XR_FB_hand_tracking_aim")) {
// call Meta/FB hand-tracking APIs (xrLocateHandJointsFB or related FB calls)
} else {
// fallback: no hand tracking available
}
Sources:
Khronos OpenXR registry (extension specs, incl. FB hand mesh) — authoritative API signatures and behavior [9].
Vendor / layer docs (examples: Ultraleap, Manus) showing how FB/EXT data can be exposed or translated by layers [3][4].
Engine docs (Unity, Godot) that show how runtime features map to engine APIs and which extensions they enable [6][8].
Final note
Since you can’t (or don’t want to) get a Meta dev account, you can still write robust code: detect available extensions at runtime and branch to the FB APIs when present. That’s the standard, production‑safe approach for supporting both Khronos EXT and vendor FB hand‑tracking on Quest devices. Sadly, Meta does not make this easy to test on their devices without getting an account with them and agreeing to their EULA and all the other stuff they make you accept.
Where are you getting XR_FB_hand_tracking from? It’s not a registered OpenXR extension, and it isn’t documented anywhere. The other FB extensions are designed as additions to the multivendor XR_EXT_hand_tracking extension and won’t work without it enabled.
Indeed, I couldn’t find anything about it either. Also, that last answer’s structure is signature ChatGPT output, so XR_FB_hand_tracking looks like a hallucination at this point. And while I appreciate anyone who tries to help (some of these links are very useful indeed), I don’t know what to think anymore about this subject.
It is possible in some cases to get data from alternative APIs, but in this case Meta seems to be actively trying to prevent those features from being used in production applications. If there are any alternative ways to get the data, the most likely one would be the legacy LibOVR API. I don’t know whether it even supports hand tracking, or whether it can be used alongside OpenXR. It would be risky to use anyway, since Meta could simply shut down access to those features.
OpenXR API layers can provide extensions that are not supported by the runtime itself. They still need access to the same data, though, so they won’t help if the data isn’t available anywhere in the first place. For hand tracking they are mostly used by third-party vendors to inject support for their own hand-tracking sources without the runtime having to support them.
You must enable a specific setting in the Meta Link PC app to see those extensions. They are supported but hidden by default. This is the only way to access the Quest hardware without jailbreaking, besides using mods that allow you to sideload extensions to expose this data.
Here is a related guide with pictures showing exactly where to find the toggle:
Note: the guides are not always exact to your specific situation, so you might need to adapt.
Also note that Meta has recently changed this, so depending on your hardware and device version, it might or might not work in PCVR. I personally use Steam Link to access this. But I believe Unity and Unreal still use the older interfaces, so it might work with PCVR.
Neither of those extensions refers to any alternative to the multivendor extension, only additions that are dependent on it. The hand tracking extension was published fairly soon after OpenXR initially released; there is no reason why Meta would need an alternative one.