OpenXR for eye tracking or hand tracking only / Oculus SDK or OpenVR for rendering

Hi,

I was wondering whether it is possible to use an OpenXR instance to get eye tracking or hand tracking data (from a Meta Quest Pro, for instance) while rendering on another thread or process with the Oculus SDK or OpenVR (using OpenGL, running on Windows).
As far as I know (from reading forum posts from mid-2022), it seems that to access these extensions on the Meta Quest Pro I would have to do everything (rendering and eye tracking/hand tracking) through OpenXR, since a session cannot be created without the graphics API binding filled in, but I would be grateful if someone could confirm or refute this.
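
For reference, this is roughly the session creation step I mean, the point where the graphics binding seems mandatory. Just a sketch, assuming an instance created with XR_KHR_opengl_enable and an existing GL device/render context (`hdc`/`hglrc`), not something I have running:

```cpp
// Sketch: creating an OpenXR session with an OpenGL/Win32 graphics binding.
// Assumes `instance` and `systemId` already exist and XR_KHR_opengl_enable was
// enabled on the instance; `hdc`/`hglrc` are the app's existing GL contexts.
#define XR_USE_PLATFORM_WIN32
#define XR_USE_GRAPHICS_API_OPENGL
#include <windows.h>
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>

XrSession createSessionWithGlBinding(XrInstance instance, XrSystemId systemId,
                                     HDC hdc, HGLRC hglrc)
{
    // XR_KHR_opengl_enable requires querying the GL requirements first,
    // otherwise xrCreateSession fails with XR_ERROR_GRAPHICS_REQUIREMENTS_CALL_MISSING.
    PFN_xrGetOpenGLGraphicsRequirementsKHR pfnGetReqs = nullptr;
    xrGetInstanceProcAddr(instance, "xrGetOpenGLGraphicsRequirementsKHR",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnGetReqs));
    XrGraphicsRequirementsOpenGLKHR reqs{XR_TYPE_GRAPHICS_REQUIREMENTS_OPENGL_KHR};
    pfnGetReqs(instance, systemId, &reqs);

    // This is the "graphics API parameter" I refer to above: the graphics
    // binding chained into .next. Without it, session creation is rejected
    // (typically with XR_ERROR_GRAPHICS_DEVICE_INVALID).
    XrGraphicsBindingOpenGLWin32KHR glBinding{XR_TYPE_GRAPHICS_BINDING_OPENGL_WIN32_KHR};
    glBinding.hDC = hdc;
    glBinding.hGLRC = hglrc;

    XrSessionCreateInfo createInfo{XR_TYPE_SESSION_CREATE_INFO};
    createInfo.next = &glBinding;
    createInfo.systemId = systemId;

    XrSession session = XR_NULL_HANDLE;
    xrCreateSession(instance, &createInfo, &session);
    return session;
}
```
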
I hope you can help me.

It’s not possible, as far as I know: all current OpenXR runtimes only support a single exclusive scene application at a time. There is no explicit support for running background applications in the OpenXR spec, but there doesn’t seem to be anything that prohibits it either. It would be up to Meta to implement support for background applications.

Note that Meta doesn’t officially support using the eye and hand tracking extensions on Windows. They are only exposed in developer mode for in-development use.
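
If it helps, you can at least check at runtime whether the PC runtime lists those extensions at all before relying on them. A minimal sketch along these lines (extension name macros are from the OpenXR registry; whether Meta's Windows runtime actually exposes them is up to Meta):

```cpp
// Sketch: enumerate instance extensions and look for the eye/hand tracking ones.
#include <openxr/openxr.h>
#include <cstdio>
#include <cstring>
#include <vector>

bool runtimeHasExtension(const char* name)
{
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);
    std::vector<XrExtensionProperties> props(
        count, XrExtensionProperties{XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());
    for (const auto& p : props)
        if (std::strcmp(p.extensionName, name) == 0)
            return true;
    return false;
}

int main()
{
    std::printf("eye gaze:      %d\n",
                runtimeHasExtension(XR_EXT_EYE_GAZE_INTERACTION_EXTENSION_NAME));
    std::printf("hand tracking: %d\n",
                runtimeHasExtension(XR_EXT_HAND_TRACKING_EXTENSION_NAME));
}
```
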

Thanks for your reply. I’ll try to modify my app to use OpenXR instead of the Oculus SDK for tracking & rendering (and controller events …) … and then get the eye tracking data through the OpenXR session I create.
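
For anyone finding this later, this is roughly the eye gaze setup I'm planning, a sketch of how I understand XR_EXT_eye_gaze_interaction works (assuming the runtime exposes it and it was enabled at instance creation; `appSpace` and `predictedTime` come from my existing frame loop), not tested yet:

```cpp
// Sketch: action-based eye gaze via XR_EXT_eye_gaze_interaction.
#include <openxr/openxr.h>
#include <cstring>

struct EyeGaze {
    XrActionSet actionSet = XR_NULL_HANDLE;
    XrAction    gazeAction = XR_NULL_HANDLE;
    XrSpace     gazeSpace = XR_NULL_HANDLE;
};

EyeGaze setupEyeGaze(XrInstance instance, XrSession session)
{
    EyeGaze eg;

    XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
    std::strcpy(setInfo.actionSetName, "eyetracking");
    std::strcpy(setInfo.localizedActionSetName, "Eye Tracking");
    xrCreateActionSet(instance, &setInfo, &eg.actionSet);

    XrActionCreateInfo actInfo{XR_TYPE_ACTION_CREATE_INFO};
    actInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    std::strcpy(actInfo.actionName, "gaze");
    std::strcpy(actInfo.localizedActionName, "Gaze");
    xrCreateAction(eg.actionSet, &actInfo, &eg.gazeAction);

    // Bind the pose action to the eye gaze interaction profile.
    XrPath profilePath, gazePath;
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction", &profilePath);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose", &gazePath);
    XrActionSuggestedBinding binding{eg.gazeAction, gazePath};
    XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    XrSessionActionSetsAttachInfo attach{XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
    attach.countActionSets = 1;
    attach.actionSets = &eg.actionSet;
    xrAttachSessionActionSets(session, &attach);

    // Action space to locate the gaze pose each frame (identity offset).
    XrActionSpaceCreateInfo spaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
    spaceInfo.action = eg.gazeAction;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f;
    xrCreateActionSpace(session, &spaceInfo, &eg.gazeSpace);
    return eg;
}

// Per frame (session must be focused for xrSyncActions to update the action).
XrPosef locateGaze(XrSession session, const EyeGaze& eg, XrSpace appSpace, XrTime predictedTime)
{
    XrActiveActionSet active{eg.actionSet, XR_NULL_PATH};
    XrActionsSyncInfo sync{XR_TYPE_ACTIONS_SYNC_INFO};
    sync.countActiveActionSets = 1;
    sync.activeActionSets = &active;
    xrSyncActions(session, &sync);

    XrSpaceLocation loc{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(eg.gazeSpace, appSpace, predictedTime, &loc);
    return loc.pose;  // only meaningful if loc.locationFlags has the valid bits set
}
```
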
