I have experience in 3D programming (OpenGL/D3D), as well as SteamVR and OpenVR, but I'm fairly new to OpenXR.
I’m adding OpenXR support to an existing application.
To do so:
- I capture the framebuffer from the application once it has been rendered.
- For each rendered framebuffer I keep the headset's XrPosef it was rendered with, from that frame's xrWaitFrame / xrBeginFrame / xrEndFrame cycle.
- I render this framebuffer as a render-to-texture (RTT) onto a quad, but I'd like to add a relative offset between the current headset pose and the pose it was rendered with (a small bookkeeping sketch follows this list).
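Roughly, the bookkeeping I mean is something like this minimal sketch (CapturedFrame and frameHistory are illustrative names, assuming D3D11 textures):

```cpp
#include <deque>
#include <d3d11.h>
#include <wrl/client.h>
#include <openxr/openxr.h>

// Pair each captured framebuffer with the headset pose (and display time) it
// was rendered with, so the quad can later be placed relative to that pose.
struct CapturedFrame {
    Microsoft::WRL::ComPtr<ID3D11Texture2D> colorTexture; // copy of the app framebuffer
    XrPosef renderPose; // headset pose the frame was rendered with
    XrTime displayTime; // predictedDisplayTime of that frame
};

std::deque<CapturedFrame> frameHistory; // consumed when submitting to OpenXR
```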
Right now I have something more or less working in the following way:
- XrSpace appSpace of type XR_REFERENCE_SPACE_TYPE_VIEW
- XrSpace poseSpace of type XR_REFERENCE_SPACE_TYPE_LOCAL
- Before starting to render a new frame I calculate the predicted pose for the next frame using poseSpace (LOCAL).
- When submitting the frame to OpenXR I locate the headset pose again at xrWaitFrame's predictedDisplayTime in poseSpace (LOCAL), compute the relative transform between the original rendering pose and the new one, and adjust the quad position through the projection and view matrices.
- For XrViewLocateInfo I use appSpace (VIEW) (a rough setup sketch follows this list).
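In code, that setup looks roughly like this minimal sketch (error handling omitted; session and predictedDisplayTime are assumed to come from the usual session creation and xrWaitFrame):

```cpp
#include <vector>
#include <openxr/openxr.h>

XrSpace appSpace = XR_NULL_HANDLE;  // VIEW space, used for XrViewLocateInfo
XrSpace poseSpace = XR_NULL_HANDLE; // LOCAL space, used to track the headset pose

// Create the two reference spaces with an identity pose offset.
void CreateSpaces(XrSession session) {
    XrReferenceSpaceCreateInfo createInfo{XR_TYPE_REFERENCE_SPACE_CREATE_INFO};
    createInfo.poseInReferenceSpace = {{0.f, 0.f, 0.f, 1.f}, {0.f, 0.f, 0.f}}; // identity

    createInfo.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_VIEW;
    xrCreateReferenceSpace(session, &createInfo, &appSpace);

    createInfo.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_LOCAL;
    xrCreateReferenceSpace(session, &createInfo, &poseSpace);
}

// Locate the per-eye views in LOCAL space at the frame's predicted display time.
std::vector<XrView> LocateViews(XrSession session, XrTime predictedDisplayTime) {
    XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
    locateInfo.displayTime = predictedDisplayTime;
    locateInfo.space = poseSpace; // or appSpace (VIEW), depending on what is needed

    XrViewState viewState{XR_TYPE_VIEW_STATE};
    uint32_t viewCount = 0;
    xrLocateViews(session, &locateInfo, &viewState, 0, &viewCount, nullptr);

    std::vector<XrView> views(viewCount, {XR_TYPE_VIEW});
    xrLocateViews(session, &locateInfo, &viewState, viewCount, &viewCount, views.data());
    return views;
}
```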
As I said, it works, but there is a slight jitter, and I was wondering whether there is a more elegant way to do it.
My question is:
Is there any other way to apply this relative pose, instead of doing it myself in the graphics plugin with a shader? Maybe by creating a space with XrReferenceSpaceCreateInfo or something similar?
The goal is to make something similar to OpenVR’s Submit_TextureWithPose:
This is already how submission works with xrEndFrame(). The XrCompositionLayerProjectionView already takes the XrPosef that you used for rendering. This is strictly equivalent to the OpenVR Submit_TextureWithPose that you mentioned. This will take care of doing late-stage reprojection with the latest headset pose.
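To make that concrete, here is a minimal sketch of the submission path (variable names are illustrative; swapchain acquisition and release are omitted):

```cpp
#include <openxr/openxr.h>

// Each projection view carries the pose and fov that were used for rendering;
// the runtime reprojects the submitted image to the latest headset pose itself.
void SubmitFrame(XrSession session, XrTime predictedDisplayTime,
                 const XrView renderedViews[2], XrSwapchain swapchains[2],
                 XrSpace renderSpace, int32_t width, int32_t height) {
    XrCompositionLayerProjectionView projectionViews[2];
    for (uint32_t i = 0; i < 2; ++i) {
        projectionViews[i] = {XR_TYPE_COMPOSITION_LAYER_PROJECTION_VIEW};
        projectionViews[i].pose = renderedViews[i].pose; // pose the frame was rendered with
        projectionViews[i].fov = renderedViews[i].fov;
        projectionViews[i].subImage.swapchain = swapchains[i];
        projectionViews[i].subImage.imageRect = {{0, 0}, {width, height}};
    }

    XrCompositionLayerProjection layer{XR_TYPE_COMPOSITION_LAYER_PROJECTION};
    layer.space = renderSpace; // the space the poses above are expressed in
    layer.viewCount = 2;
    layer.views = projectionViews;

    const XrCompositionLayerBaseHeader* layers[] = {
        reinterpret_cast<const XrCompositionLayerBaseHeader*>(&layer)};

    XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
    endInfo.displayTime = predictedDisplayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
    endInfo.layerCount = 1;
    endInfo.layers = layers;
    xrEndFrame(session, &endInfo);
}
```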
Basically the problem is that the application I'm adding OpenXR to was doing its rendering outside of xrWaitFrame, xrBeginFrame, xrEndFrame. It's a fairly complex application with several threads. I won't go into detail, but it worked more or less like this:
Thread 1 (game loop thread):
- xrLocateViews from LOCAL Space
- Update head pose in game
- Render game (do game D3D11 primitives)

Thread 2 (render thread):
- xrWaitFrame
- xrBeginFrame
- xrLocateViews from VIEW Space for XrCompositionLayerProjectionView
- … graphicsPlugin->RenderView(…) …
- xrEndFrame
After some refactoring, I've finally left it like this (a sketch of the resulting render-thread loop follows the list):
Thread 1 (game loop thread):
- Update head pose in game

Thread 2 (render thread):
- xrLocateViews from LOCAL Space
- xrWaitFrame
- xrBeginFrame
- xrLocateViews from VIEW Space for XrCompositionLayerProjectionView
- Render game (do game D3D11 primitives)
- … graphicsPlugin->RenderView(…) …
- xrEndFrame
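In case it helps, here is a minimal sketch of that refactored render-thread frame (RenderGame and SubmitProjectionLayer are placeholder names for the application's own rendering and layer submission; swapchain handling is omitted):

```cpp
#include <vector>
#include <openxr/openxr.h>

// Placeholders for the application's own code (illustrative names only).
void RenderGame(const std::vector<XrView>& views);
void SubmitProjectionLayer(XrSession session, XrTime displayTime,
                           const std::vector<XrView>& views);

// The game rendering now happens between xrBeginFrame and xrEndFrame, so the
// poses submitted in the projection views match the frame being displayed and
// the runtime performs the late-stage reprojection itself.
void RenderThreadFrame(XrSession session, XrSpace localSpace) {
    XrFrameWaitInfo waitInfo{XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frameState{XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &waitInfo, &frameState);

    XrFrameBeginInfo beginInfo{XR_TYPE_FRAME_BEGIN_INFO};
    xrBeginFrame(session, &beginInfo);

    // Locate the views at this frame's predicted display time.
    XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
    locateInfo.displayTime = frameState.predictedDisplayTime;
    locateInfo.space = localSpace;

    XrViewState viewState{XR_TYPE_VIEW_STATE};
    uint32_t viewCount = 0;
    xrLocateViews(session, &locateInfo, &viewState, 0, &viewCount, nullptr);
    std::vector<XrView> views(viewCount, {XR_TYPE_VIEW});
    xrLocateViews(session, &locateInfo, &viewState, viewCount, &viewCount, views.data());

    if (frameState.shouldRender) {
        // Game D3D11 rendering, now inside the frame, followed by
        // graphicsPlugin->RenderView(...) into the swapchain images.
        RenderGame(views);
    }

    // xrEndFrame: the projection views submitted here carry the poses located above.
    SubmitProjectionLayer(session, frameState.predictedDisplayTime, views);
}
```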
I've removed all my manual view/projection matrix work in the shader for repositioning the quad, and voilà! It works like a charm.
Of course the two threads are synchronized with a mutex where needed, roughly as in the sketch below. It's a little more complex than what I've described, but that's the general idea.
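For example, the head pose sharing between the threads is roughly like this minimal sketch (hypothetical names, assuming the render thread is the one locating the views):

```cpp
#include <mutex>
#include <openxr/openxr.h>

// The render thread publishes the latest located head pose; the game loop
// thread reads it under the same mutex.
struct SharedHeadPose {
    std::mutex mutex;
    XrPosef pose{{0.f, 0.f, 0.f, 1.f}, {0.f, 0.f, 0.f}}; // identity until first locate
    bool valid = false;
};

void PublishHeadPose(SharedHeadPose& shared, const XrPosef& located) { // render thread
    std::lock_guard<std::mutex> lock(shared.mutex);
    shared.pose = located;
    shared.valid = true;
}

bool ReadHeadPose(SharedHeadPose& shared, XrPosef& out) { // game loop thread
    std::lock_guard<std::mutex> lock(shared.mutex);
    if (!shared.valid) return false;
    out = shared.pose;
    return true;
}
```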
I think my manual quad repositioning worked, but the small flickering was due to the small error margin between OpenXR's reprojection and my home-made reprojection.
Thank you very much! I appreciate it a lot!!
(And thanks also for your work in OpenXR Toolkit. You are a legend!!)