Support for reading back the compositor output

I’d like to see the ability to read the runtime compositor’s output for each eye back into the application as a texture. OpenVR has this feature, and I’ve been playing around with using it to add a simple form of lighting to overlay applications, allowing them to be lit by the scene application and other overlays.

I know that, at least early on, this was a complete nonstarter in the WG, not even for use by the conformance tests (which is why some of them require a human to interact). It probably wouldn’t get much more traction today, but you could certainly talk to some vendors about it, prototype it in Monado, etc. (Not only is Monado open source, it already has a recently added texture read-back for screen capture and debugging.)

You might have more uptake/interest in an extension that is more narrowly targeted for getting lighting parameters for overlay/secondary apps. I can see such a thing being pretty important in the future.
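For concreteness, here is a rough sketch (entirely my own illustration, not from any spec or runtime) of the kind of simple lighting parameters such an extension might expose: an ambient term plus a dominant light direction and color, estimated from an environment cubemap. The function name and cubemap layout are assumptions.

```python
import numpy as np

def estimate_lighting(cubemap):
    """Estimate simple lighting parameters from an environment cubemap.

    cubemap: dict mapping face name ("+x", "-x", ...) to an (H, W, 3)
    array of linear RGB. Returns (ambient_rgb, direction, dominant_rgb).
    """
    # Outward axis for each cube face.
    face_axes = {
        "+x": np.array([1.0, 0.0, 0.0]), "-x": np.array([-1.0, 0.0, 0.0]),
        "+y": np.array([0.0, 1.0, 0.0]), "-y": np.array([0.0, -1.0, 0.0]),
        "+z": np.array([0.0, 0.0, 1.0]), "-z": np.array([0.0, 0.0, -1.0]),
    }
    # Ambient term: mean radiance over all texels of all faces.
    all_texels = np.concatenate([f.reshape(-1, 3) for f in cubemap.values()])
    ambient = all_texels.mean(axis=0)
    # Dominant light direction: luminance-weighted average of face axes.
    weights = {}
    for name, face in cubemap.items():
        lum = face.reshape(-1, 3) @ np.array([0.2126, 0.7152, 0.0722])
        weights[name] = lum.mean()
    direction = sum(face_axes[n] * w for n, w in weights.items())
    norm = np.linalg.norm(direction)
    direction = direction / norm if norm > 1e-6 else np.array([0.0, 1.0, 0.0])
    # Dominant light color: mean radiance of the brightest face.
    brightest = max(weights, key=weights.get)
    dominant_rgb = cubemap[brightest].reshape(-1, 3).mean(axis=0)
    return ambient, direction, dominant_rgb
```

An overlay app could feed these three values into an ambient plus single directional light model, which is roughly what existing AR light-estimation APIs return.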

Thanks. I’m currently modifying the Unreal SteamVR plugin for rendering to overlays, and thought of reading back the output for lighting. I might have to implement an OpenXR version with support for the Monado overlay extension too.

I wonder what kind of light readback would work best for an OpenXR standard. There seem to be light-estimation systems available in AR implementations that calculate and return point-light parameters from the scene. I’m not really sure how they work or how effective they are. The solution I came up with was deprojecting the compositor output and temporally accumulating it into a cubemap texture.
https://twitter.com/rectus_sa/status/1542180592929980418
https://twitter.com/rectus_sa/status/1544792214274015236
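The deprojection step can be sketched as below; this is a minimal illustration of the math, assuming a symmetric projection and the standard major-axis cubemap convention, not the actual implementation from the videos above.

```python
import numpy as np

def pixel_to_direction(u, v, fov_tan, eye_to_world):
    """Deproject a normalized eye-image coordinate (u, v in [0, 1]) to a
    world-space view ray, given the projection's half-angle tangents
    fov_tan = (tan_half_x, tan_half_y) and the eye-to-world transform."""
    ndc = np.array([2.0 * u - 1.0, 1.0 - 2.0 * v])  # pixel -> NDC, flip Y
    # Ray through the pixel in eye space, on the z = -1 plane.
    ray_eye = np.array([ndc[0] * fov_tan[0], ndc[1] * fov_tan[1], -1.0])
    ray_world = eye_to_world[:3, :3] @ ray_eye  # rotate into world space
    return ray_world / np.linalg.norm(ray_world)

def direction_to_cubemap_texel(d, size):
    """Map a unit direction to (face, x, y) in a cubemap with the given
    per-face resolution, using the usual major-axis face selection."""
    ax = np.abs(d)
    if ax[0] >= ax[1] and ax[0] >= ax[2]:          # +X / -X
        face, sc, tc, ma = (0 if d[0] > 0 else 1), (-d[2] if d[0] > 0 else d[2]), -d[1], ax[0]
    elif ax[1] >= ax[2]:                           # +Y / -Y
        face, sc, tc, ma = (2 if d[1] > 0 else 3), d[0], (d[2] if d[1] > 0 else -d[2]), ax[1]
    else:                                          # +Z / -Z
        face, sc, tc, ma = (4 if d[2] > 0 else 5), (d[0] if d[2] > 0 else -d[0]), -d[1], ax[2]
    x = int((sc / ma * 0.5 + 0.5) * (size - 1))
    y = int((tc / ma * 0.5 + 0.5) * (size - 1))
    return face, x, y
```

Each compositor texel would be pushed through these two steps every frame and blended into the cubemap, which is what gives the temporal accumulation.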

Ryan, would you mind elaborating on how Monado works with different products? I know from memory that some of it is reverse engineered, alternative drivers, etc. Are there any cases where it leverages low-level APIs/micro-drivers provided by the vendors? (I know I could browse the sources, but thought I’d ask instead!)

I think there’s a driver that uses the Leap Motion (Ultraleap) API. There’s also one that uses the RealSense development packages. Those are the only “vendor packages” I can think of offhand that it uses in the upstream codebase.
