OpenXR Tutorial - Question about Rendering Synchronization


I’m currently reading through the OpenXR tutorial, and I have a few questions about how synchronization works between the graphics API (in my case, Vulkan) and the OpenXR runtime. In the tutorial, there appears to be a fence that is signaled after every EndRendering() and waited on during every BeginRendering().

In the RenderFrame() logic, this wait only happens once, at the start of the frame, so I believe we never wait on the fence again before releasing the swapchain image. Is there a possibility of a data race if the runtime starts consuming the image before rendering has actually completed? Or is there something in the spec that implicitly synchronizes this?
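To make the ordering I’m asking about concrete, here is a rough sketch of the per-frame flow as I understand it. This is not verbatim tutorial code: BeginRendering()/EndRendering() are the tutorial’s helpers, and exactly where the fence is waited on and submitted is my paraphrase of them.

```cpp
// Sketch of one frame, as I read the tutorial (paraphrased, not verbatim).
uint32_t imageIndex;
xrAcquireSwapchainImage(swapchain, &acquireInfo, &imageIndex);
xrWaitSwapchainImage(swapchain, &waitInfo);  // blocks until the compositor is done reading this image

BeginRendering();  // waits on the VkFence signaled by the *previous* frame's submit
// ... record and submit command buffers rendering into the swapchain image ...
EndRendering();    // vkQueueSubmit(queue, 1, &submitInfo, fence); fence signals on GPU completion

// There is no vkWaitForFences here, so the GPU may still be rendering when we call:
xrReleaseSwapchainImage(swapchain, &releaseInfo);
xrEndFrame(session, &frameEndInfo);
```

So the question is whether xrReleaseSwapchainImage/xrEndFrame are allowed to hand the image to the compositor while the submitted work is still in flight.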

It doesn’t seem to be explicitly stated anywhere, but unless I’m mistaken, the runtime has to handle this synchronization internally. Frame timing in VR applications is very tight, so the runtime needs to be able to decide quickly whether to preempt the currently rendering frame and instead composite and reproject the previous one.
