Swapchain memory access for graphics API interop

I’m trying to implement a graphics API interop in my OpenXR API layer, where I take the Vulkan-created swapchain image the application has already rendered to and pass it to my Direct3D11 renderer as an ID3D11Resource. The problem is that vkGetMemoryWin32HandleKHR requires the VkDeviceMemory object the image is bound to, and xrAcquireSwapchainImage only gives me the VkImage itself.

Is it possible to get the VkDeviceMemory from the image object, or is there any other way to get a shared resource of the swapchain?
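To illustrate the mismatch, here is roughly what my layer can get today versus what the export call needs (a simplified sketch, error handling omitted, function names are my own):

```cpp
#include <windows.h>
#define VK_USE_PLATFORM_WIN32_KHR
#include <vulkan/vulkan.h>
#define XR_USE_GRAPHICS_API_VULKAN
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>
#include <vector>

// What the layer can observe: the runtime only hands back VkImage handles.
std::vector<VkImage> GetSwapchainImages(XrSwapchain swapchain)
{
    uint32_t count = 0;
    xrEnumerateSwapchainImages(swapchain, 0, &count, nullptr);
    std::vector<XrSwapchainImageVulkanKHR> images(count, {XR_TYPE_SWAPCHAIN_IMAGE_VULKAN_KHR});
    xrEnumerateSwapchainImages(swapchain, count, &count,
                               reinterpret_cast<XrSwapchainImageBaseHeader*>(images.data()));

    std::vector<VkImage> result;
    for (const auto& image : images) {
        result.push_back(image.image);
    }
    return result;
}

// What exporting a Win32 handle needs: the VkDeviceMemory the image is bound
// to, which the runtime allocated internally and never exposes through OpenXR.
HANDLE ExportBackingMemory(VkDevice device, VkDeviceMemory memory /* <- no way to obtain this */)
{
    const auto pfnGetMemoryWin32HandleKHR = reinterpret_cast<PFN_vkGetMemoryWin32HandleKHR>(
        vkGetDeviceProcAddr(device, "vkGetMemoryWin32HandleKHR"));

    VkMemoryGetWin32HandleInfoKHR getInfo{VK_STRUCTURE_TYPE_MEMORY_GET_WIN32_HANDLE_INFO_KHR};
    getInfo.memory = memory;
    getInfo.handleType = VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_WIN32_BIT;

    HANDLE handle = nullptr;
    pfnGetMemoryWin32HandleKHR(device, &getInfo, &handle);
    return handle;
}
```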

I don’t think there is any guarantee that the OpenXR runtime created the swapchain images with sufficient sharing properties anyway (things like VkExportMemoryAllocateInfo and friends). And unlike D3D, I don’t think Vulkan gives you a proper way to trace back the objects used at resource creation (like the VkDeviceMemory you are talking about).
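To make concrete what “sufficient sharing properties” means: the runtime would have had to opt the allocation into export when it created the image, roughly like the sketch below (illustrative only; the image itself would also need VkExternalMemoryImageCreateInfo in its create-info chain, and the memory type selection is left as a parameter):

```cpp
#include <vulkan/vulkan.h>

// Sketch of what the *runtime* would have had to do when allocating the memory
// backing a swapchain image for that memory to be exportable later. A layer
// never sees this, and nothing in OpenXR requires the runtime to do it.
VkDeviceMemory AllocateExportableMemory(VkDevice device, VkImage image, uint32_t memoryTypeIndex)
{
    VkMemoryRequirements requirements{};
    vkGetImageMemoryRequirements(device, image, &requirements);

    // Opt the allocation into export as an opaque Win32 handle.
    VkExportMemoryAllocateInfo exportInfo{VK_STRUCTURE_TYPE_EXPORT_MEMORY_ALLOCATE_INFO};
    exportInfo.handleTypes = VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_WIN32_BIT;

    VkMemoryAllocateInfo allocInfo{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO};
    allocInfo.pNext = &exportInfo;
    allocInfo.allocationSize = requirements.size;
    allocInfo.memoryTypeIndex = memoryTypeIndex;  // chosen from requirements.memoryTypeBits

    VkDeviceMemory memory = VK_NULL_HANDLE;
    vkAllocateMemory(device, &allocInfo, nullptr, &memory);
    vkBindImageMemory(device, image, memory, 0);
    return memory;
}
```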

One thing I’ve noticed experimentally (with D3D app devices, but I think the same would apply to Vulkan app devices, since most runtimes out there use D3D for composition on Windows) is that many runtimes don’t give you nicely shareable textures out of swapchains in the first place. I only had unconditional success with WMR. Varjo and Oculus both gave me textures that failed to export as handles or failed to import later (even when they had the correct creation flags; I suspect this is because most of these textures are already shared from somewhere else, and re-sharing them comes with all sorts of restrictions). I don’t remember if I tried SteamVR.

So the safest and most portable approach is to use an intermediate texture anyway (one that you create as shareable on either the D3D11 or the Vulkan side and then share with your other device). I had code in my API layer to try to avoid the intermediate texture when it looked possible to share directly, but again I ended up adding a quirk to only ever do that on WMR, since it failed on all other runtimes.
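Roughly, the intermediate-texture path looks like this when you create the texture on the D3D11 side and import it into Vulkan (a simplified sketch with no error handling; the VkImage you bind it to is assumed to be created with a matching format/size and VkExternalMemoryImageCreateInfo for the D3D11 texture handle type, and the function names are my own):

```cpp
#include <windows.h>
#define VK_USE_PLATFORM_WIN32_KHR
#include <vulkan/vulkan.h>
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// 1) Create the intermediate texture on the D3D11 side with NT handle sharing.
HANDLE CreateSharedD3D11Texture(ID3D11Device* device, UINT width, UINT height,
                                ComPtr<ID3D11Texture2D>& texture)
{
    D3D11_TEXTURE2D_DESC desc{};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED_NTHANDLE | D3D11_RESOURCE_MISC_SHARED_KEYED_MUTEX;
    device->CreateTexture2D(&desc, nullptr, texture.ReleaseAndGetAddressOf());

    ComPtr<IDXGIResource1> dxgiResource;
    texture.As(&dxgiResource);
    HANDLE handle = nullptr;
    dxgiResource->CreateSharedHandle(nullptr, DXGI_SHARED_RESOURCE_READ | DXGI_SHARED_RESOURCE_WRITE,
                                     nullptr, &handle);
    return handle;
}

// 2) Import that handle on the Vulkan side. D3D11 texture handles are in
//    practice dedicated-only, so chain a dedicated allocation to the image.
VkDeviceMemory ImportIntoVulkan(VkDevice device, VkImage image, HANDLE handle, uint32_t memoryTypeIndex)
{
    VkMemoryDedicatedAllocateInfo dedicatedInfo{VK_STRUCTURE_TYPE_MEMORY_DEDICATED_ALLOCATE_INFO};
    dedicatedInfo.image = image;

    VkImportMemoryWin32HandleInfoKHR importInfo{VK_STRUCTURE_TYPE_IMPORT_MEMORY_WIN32_HANDLE_INFO_KHR};
    importInfo.pNext = &dedicatedInfo;
    importInfo.handleType = VK_EXTERNAL_MEMORY_HANDLE_TYPE_D3D11_TEXTURE_BIT;
    importInfo.handle = handle;

    VkMemoryRequirements requirements{};
    vkGetImageMemoryRequirements(device, image, &requirements);

    VkMemoryAllocateInfo allocInfo{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO};
    allocInfo.pNext = &importInfo;
    allocInfo.allocationSize = requirements.size;
    allocInfo.memoryTypeIndex = memoryTypeIndex;  // from requirements.memoryTypeBits

    VkDeviceMemory memory = VK_NULL_HANDLE;
    vkAllocateMemory(device, &allocInfo, nullptr, &memory);
    vkBindImageMemory(device, image, memory, 0);
    return memory;
}
```

On the Vulkan side you then copy between the app’s swapchain image and the imported image, and on the D3D11 side you use the ID3D11Texture2D directly; synchronization (the keyed mutex or a shared fence) still has to be handled on top of this.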


Thanks! This is for my SteamVR passthrough layer, and I think SteamVR serves shared textures at least.

Using an intermediate texture sounds like it might be the only viable option. I have a basic Vulkan renderer at the moment, but maintaining it is proving to be too much work as I add new features.

I think SteamVR will give you textures shared from another API; for example, our WMR driver uses DX12, and we export textures to the SteamVR compositor, which in turn gives you these textures. Whether these are exportable again to a 3rd API (DX11), I am really not sure 🙂

Btw I have been dealing with the same challenge as you, and I’ve decided to go 100% DX11 for my drawing code in the API layer, then create interop for DX12 and Vulkan apps (swapchains and fences only). This helped me keep the code simpler. My GitHub has this “composition framework” doing all this. I haven’t merged Vulkan support publicly, but I have a working version. Of course the Vulkan interop code is more complex than the DX12 one, but that’s expected. For Vulkan I just always assumed I need an intermediate texture, because of the (lack of) guarantees described in my first message.
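For the fences part, the sharing itself is fairly mechanical. Something along these lines (a sketch, assuming the Vulkan driver exposes VK_EXTERNAL_SEMAPHORE_HANDLE_TYPE_D3D12_FENCE_BIT, which should also accept D3D11 fences created through ID3D11Device5::CreateFence; worth confirming via vkGetPhysicalDeviceExternalSemaphoreProperties):

```cpp
#include <windows.h>
#define VK_USE_PLATFORM_WIN32_KHR
#include <vulkan/vulkan.h>
#include <d3d11_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// D3D11 side: create a shareable fence and export an NT handle for it.
HANDLE CreateSharedFence(ID3D11Device5* device, ComPtr<ID3D11Fence>& fence)
{
    device->CreateFence(0, D3D11_FENCE_FLAG_SHARED, IID_PPV_ARGS(&fence));
    HANDLE handle = nullptr;
    fence->CreateSharedHandle(nullptr, GENERIC_ALL, nullptr, &handle);
    return handle;
}

// Vulkan side: import that handle into a timeline semaphore
// (requires VK_KHR_external_semaphore_win32 and timeline semaphores).
VkSemaphore ImportFenceAsSemaphore(VkDevice device, HANDLE handle)
{
    VkSemaphoreTypeCreateInfo typeInfo{VK_STRUCTURE_TYPE_SEMAPHORE_TYPE_CREATE_INFO};
    typeInfo.semaphoreType = VK_SEMAPHORE_TYPE_TIMELINE;

    VkSemaphoreCreateInfo createInfo{VK_STRUCTURE_TYPE_SEMAPHORE_CREATE_INFO};
    createInfo.pNext = &typeInfo;
    VkSemaphore semaphore = VK_NULL_HANDLE;
    vkCreateSemaphore(device, &createInfo, nullptr, &semaphore);

    const auto pfnImportSemaphoreWin32HandleKHR = reinterpret_cast<PFN_vkImportSemaphoreWin32HandleKHR>(
        vkGetDeviceProcAddr(device, "vkImportSemaphoreWin32HandleKHR"));

    VkImportSemaphoreWin32HandleInfoKHR importInfo{VK_STRUCTURE_TYPE_IMPORT_SEMAPHORE_WIN32_HANDLE_INFO_KHR};
    importInfo.semaphore = semaphore;
    importInfo.handleType = VK_EXTERNAL_SEMAPHORE_HANDLE_TYPE_D3D12_FENCE_BIT;
    importInfo.handle = handle;
    pfnImportSemaphoreWin32HandleKHR(device, &importInfo);
    return semaphore;
}
```

One side then signals a value on the fence (ID3D11DeviceContext4::Signal) and the other side waits on the same value on the timeline semaphore, or vice versa.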

Good luck!

