Any way to make a method similar to OpenVR’s Submit_TextureWithPose?

Hi everybody!

I have experience in 3D programming (OpenGL/D3D) as well as SteamVR and OpenVR, but I’m fairly new to OpenXR.

I’m adding OpenXR support to an existing application.

For doing so:

  • I capture the application’s framebuffer once it has been rendered
  • For each captured framebuffer I keep the headset’s XrPosef it was rendered with, recorded during the xrWaitFrame / xrBeginFrame / xrEndFrame cycle
  • I render this framebuffer as a render-to-texture onto a quad, but I’d like to apply a relative offset between the current headset pose and the one the frame was rendered with

Now I have something kinda working the following way:

  • xrSpace: appSpace of type XR_REFERENCE_SPACE_TYPE_VIEW
  • xrSpace: poseSpace of type XR_REFERENCE_SPACE_TYPE_LOCAL
  1. Before starting to render a new frame, I calculate the predicted pose for the next frame using poseSpace (LOCAL)
  2. When submitting the frame to OpenXR, I recalculate the headset pose at xrWaitFrame’s predictedDisplayTime in poseSpace (LOCAL), compute the relative pose between the originally rendered pose and the new one, and adjust the quad’s position via the projection and view matrices
  3. For XrViewLocateInfo I use appSpace (VIEW)

As I said, it works, but there is a slight jitter, and I was wondering if there is a more elegant way to do it.

My question is:

Is there any other way to apply this relative pose, instead of doing it myself in the graphics plugin with a shader? Maybe by creating the space with a suitable XrReferenceSpaceCreateInfo, or something similar?

The goal is to make something similar to OpenVR’s Submit_TextureWithPose:

vr_compositor_->Submit(vr::Eye_Right, &rightEyeTexture, nullptr, vr::Submit_TextureWithPose);

Thank you very much in advance for any help!

Kind regards!!

This is already how submission works with xrEndFrame(). The XrCompositionLayerProjectionView already takes the XrPosef that you used for rendering. This is strictly equivalent to the OpenVR Submit_TextureWithPose that you mentioned. This will take care of doing late-stage reprojection with the latest headset pose.

I must be missing something in your question…


Thank you very much for your answer :slight_smile:

Thanks to it I’ve seen the light!

Basically, the problem is that the application I’m adding OpenXR to was doing its rendering outside of the xrWaitFrame / xrBeginFrame / xrEndFrame cycle. It’s a fairly complex application with several threads. I won’t go into detail, but it worked more or less like this:

Thread 1 (game loop thread):
  • xrLocateViews from LOCAL space

Thread 2 (render thread):
  • Update head pose in game
  • Render game (the game’s D3D11 draw calls)
  • xrWaitFrame
  • xrBeginFrame
  • xrLocateViews from VIEW space for XrCompositionLayerProjectionView
  • … graphicsPlugin->RenderView(…) …
  • xrEndFrame

After some refactoring, I finally ended up with this:

Thread 1 (game loop thread):
  • Update head pose in game

Thread 2 (render thread):
  • xrLocateViews from LOCAL space
  • xrWaitFrame
  • xrBeginFrame
  • xrLocateViews from VIEW space for XrCompositionLayerProjectionView
  • Render game (the game’s D3D11 draw calls)
  • … graphicsPlugin->RenderView(…) …
  • xrEndFrame

I’ve removed all my manual view/projection matrix shader code for repositioning the quad, and voilà! It works like a charm :slight_smile:

Of course, the two threads are synchronized with mutexes where needed. It’s a bit more complex than I’ve described, but that’s the general idea.

I think my manual quad repositioning did work, but the small flickering was caused by the small discrepancy between OpenXR’s reprojection and my home-made one.

Thank you very much! I appreciate it a lot!!

(And thanks also for your work in OpenXR Toolkit. You are a legend!!)

Kind regards!


Ha this makes a lot more sense! Your new breakdown looks good!!

Glad it’s solved.

