I’m using this example as a simple playground: SingleFileExample/main.cpp from the maluoi/OpenXRSamples repo on GitHub.
I managed to enable the XR_EXT_hand_tracking extension, and it works just fine on a Meta Quest 3 over Air Link. I’m rendering small cubes for the finger joints and I see them in VR exactly as expected.
So I went on to enable the XR_FB_body_tracking extension. The first caveat seemed to be that hand tracking and body tracking can’t be active at the same time (which makes sense, since the body joint set includes the fingers), so I disabled hand tracking.
I implemented my code following the example here:
The problem is that joint locations are very wrong.
Yesterday the joint coordinates came back absurdly large, except for the first two joints: I saw only those two joints at floor level, and all the others were far out of view.
Today I restarted the Quest and switched between the SteamVR and Oculus OpenXR runtimes a few times (to debug the issues I mentioned in another thread). Then I returned to the Oculus runtime and Air Link, ran my app, and was surprised to see completely different results. This time all the joint positions are too small: all the joint cubes sit on the floor.
However, the finger joints are positioned correctly relative to the hand, and their rotations look right too, so I can see my fingers moving more or less correctly. The root joint also seems to work properly: it follows my movements across the floor. It’s just that all the other positions (everything except the fingers) collapse onto a single vertical axis and drop to the floor.
Am I missing something here? Should I somehow combine the per-frame joint locations with the skeleton (xrGetBodySkeletonFB)? But then why do the finger locations work without the skeleton?
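My current (unverified) theory is that some joints might be coming back relative to a parent joint rather than in the base space, the way the fingers clearly behave correctly relative to the hand. If that turned out to be the case, I’d expect to fix it by composing each joint’s local pose with its parent’s world pose, something like this (plain-struct sketch mirroring XrPosef math, not actual OpenXR code):

```cpp
#include <cmath>

// Stand-ins mirroring XrVector3f / XrQuaternionf / XrPosef,
// so this sketch is self-contained.
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };
struct Pose { Quat orientation; Vec3 position; };

// Rotate vector v by unit quaternion q:
// v' = v + 2w*(u x v) + 2*(u x (u x v)), where u = (q.x, q.y, q.z)
static Vec3 Rotate(const Quat& q, const Vec3& v) {
    Vec3 u{q.x, q.y, q.z};
    Vec3 t{2 * (u.y * v.z - u.z * v.y),            // t = 2 * (u x v)
           2 * (u.z * v.x - u.x * v.z),
           2 * (u.x * v.y - u.y * v.x)};
    return {v.x + q.w * t.x + (u.y * t.z - u.z * t.y),   // v + w*t + u x t
            v.y + q.w * t.y + (u.z * t.x - u.x * t.z),
            v.z + q.w * t.z + (u.x * t.y - u.y * t.x)};
}

// Hamilton product a*b (applies b's rotation first, then a's).
static Quat Mul(const Quat& a, const Quat& b) {
    return {a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
            a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
            a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
            a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z};
}

// World pose of a child joint, given the parent's world pose and the
// child's pose expressed in the parent's frame.
static Pose Compose(const Pose& parent, const Pose& childLocal) {
    Vec3 p = Rotate(parent.orientation, childLocal.position);
    return {Mul(parent.orientation, childLocal.orientation),
            {parent.position.x + p.x,
             parent.position.y + p.y,
             parent.position.z + p.z}};
}

// e.g. a root at height 1 m with identity rotation and a child joint
// 0.5 m above it in local space lands at y = 1.5 m in world space.
```

But I don’t see anything in the extension docs saying the joints are parent-relative, so before I restructure everything around this I’d like to know if anyone recognizes the symptom.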
Or is this question too specific to Meta, and should I ask on their forums instead?