A solution for the robotics world to read finger movements from a Meta Quest using Python, without requiring Unity, Unreal, etc.

In robotics we use Meta Quest 3 VR headsets a lot to train models: we record our hands doing work and replicate those movements on robots.

The problem is that the available solutions always end up depending on Unreal, Unity, Visual Studio, and other Windows-based tooling, which requires extra people and extra computers just to build the pipeline.

Also, the trained models usually end up running on NVIDIA Jetsons, so they need to be exported to that platform as well.

There is a library that reads movements from Meta Quests using a regular APK and a Python server, without needing Unity, Unreal, the Meta SDK, etc.: GitHub - rail-berkeley/oculus_reader
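For reference, this is roughly how oculus_reader is used today (a minimal sketch; the import path and method name are from my reading of the repo's README, so verify against the current code):

```python
import time

from oculus_reader.reader import OculusReader

reader = OculusReader()  # connects to the headset over adb and starts the APK

while True:
    # transforms: dict keyed by 'l'/'r' with a 4x4 pose matrix per controller
    # buttons: dict of button and trigger states
    transforms, buttons = reader.get_transformations_and_buttons()
    if 'r' in transforms:
        print('right controller pose:')
        print(transforms['r'])
    time.sleep(0.1)
```

So the Python side is already trivially simple; what is missing is per-finger data in that stream.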

The problem is that this repo does not provide finger movements: just button/trigger states and overall controller poses, even though the headset itself does track and display hands.
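To make the ask concrete, here is one hypothetical shape such a library could expose on the Python side, assuming the APK streamed OpenXR hand-tracking joints (XR_EXT_hand_tracking defines 26 joints per hand) as newline-delimited JSON over TCP. Every name, IP, and port here is made up for illustration; nothing like this exists in oculus_reader today:

```python
import json
import socket

QUEST_IP = "192.168.0.42"   # hypothetical: headset IP on the local network
PORT = 5555                 # hypothetical: port a streaming APK would listen on

# Assumed wire format, one JSON object per line:
# {"hand": "right", "joints": [{"name": "index_tip", "pos": [x, y, z],
#                               "rot": [qx, qy, qz, qw]}, ...]}

def read_hand_frames():
    """Yield one hand-tracking frame at a time from the (hypothetical) stream."""
    with socket.create_connection((QUEST_IP, PORT)) as sock:
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                yield json.loads(line)

for frame in read_hand_frames():
    # e.g. pull out the fingertip positions for a teleoperation policy
    tips = {j["name"]: j["pos"] for j in frame["joints"] if j["name"].endswith("_tip")}
    print(frame["hand"], tips)
```

Something this simple (an APK plus a dumb socket client) would be enough for most robot teleoperation and data-collection setups.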

If the community could provide a similar solution for reading finger movements from a Meta Quest using Python, without Unreal, Unity, etc., the whole robotics world would be VERY grateful.