Render to mirror window in 2D monitor

Are there any examples or best practices on how to mirror the rendered image into a 2D monitor?
Specifically with Direct3D, but I guess any API would be useful.

Thanks in advance.

Are you talking about something like this?:

Thanks for the reply!

I think I didn’t explain myself correctly. My question was about how to present the image to the monitor on a PC (duplicate it, not literally “mirror” it) in the most efficient way, possibly with a custom FOV and aspect ratio, after it has been submitted to the VR device (HMD).

One option you might consider is to render to an offscreen FBO and then do two blits (glBlitFramebuffer()): one for the VR device and one for the mirror display. An advantage of this is that you can use different sizes and have each blit do its own resize. Alternatively, render to the display and then blit to the VR device, leaving you with only one blit.
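If the mirror window’s aspect ratio differs from the HMD image, you can letterbox/pillarbox rather than stretch by computing a centered destination rectangle for the mirror blit. Here is a small sketch of that computation; `Rect` and `fit_rect` are illustrative names, not part of any API:

```c
#include <stdio.h>

/* Illustrative helper: computes a centered, aspect-preserving destination
   rectangle so the HMD image is letterboxed/pillarboxed in the mirror
   window instead of stretched. */
typedef struct { int x0, y0, x1, y1; } Rect;

static Rect fit_rect(int src_w, int src_h, int dst_w, int dst_h)
{
    Rect r;
    /* Compare aspect ratios via cross-multiplication (integer-safe). */
    if ((long long)src_w * dst_h > (long long)src_h * dst_w) {
        /* Source is wider than the window: fit width, letterbox top/bottom. */
        int h = (int)((long long)dst_w * src_h / src_w);
        r.x0 = 0;
        r.x1 = dst_w;
        r.y0 = (dst_h - h) / 2;
        r.y1 = r.y0 + h;
    } else {
        /* Source is taller: fit height, pillarbox left/right. */
        int w = (int)((long long)dst_h * src_w / src_h);
        r.y0 = 0;
        r.y1 = dst_h;
        r.x0 = (dst_w - w) / 2;
        r.x1 = r.x0 + w;
    }
    return r;
}
```

The resulting rectangle would then be passed as the destination coordinates of the mirror-window glBlitFramebuffer() call, e.g. `Rect r = fit_rect(fbo_w, fbo_h, win_w, win_h);` followed by a blit with `r.x0, r.y0, r.x1, r.y1` as the dst arguments.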

In the case of rendering to an MSAA FBO, you may want to add another blit first to downsample (resolve) the FBO before doing the two resize blits, rather than using EXT_framebuffer_multisample_blit_scaled for each. This avoids doing the downsample twice.
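That resolve-first ordering might look like the sketch below. It is not a drop-in implementation: it assumes a current GL context, an already-created single-sample resolve FBO the same size as the MSAA one, and illustrative names for the FBO handles and sizes:

```c
/* Sketch: resolve the MSAA FBO once, then do the two scaling blits.
   All handles and sizes are placeholders for your own setup. */
void present_frame(GLuint msaa_fbo, GLuint resolve_fbo, GLuint vr_fbo,
                   int w, int h,          /* offscreen render size */
                   int vr_w, int vr_h,    /* HMD swapchain size    */
                   int win_w, int win_h)  /* mirror window size    */
{
    /* 1) Resolve: a same-size blit from an MSAA FBO downsamples it once. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, msaa_fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolve_fbo);
    glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    /* 2) Scaled blit to the VR swapchain image. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, resolve_fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, vr_fbo);
    glBlitFramebuffer(0, 0, w, h, 0, 0, vr_w, vr_h,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);

    /* 3) Scaled blit to the default framebuffer (the mirror window).
       The read framebuffer is still resolve_fbo. */
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, w, h, 0, 0, win_w, win_h,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);
}
```

Note that step 1 relies on the rule that a blit between an MSAA read framebuffer and a single-sample draw framebuffer of equal rectangle size performs the multisample resolve.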

Thank you for the ideas!
Indeed, rendering to an offscreen buffer seems like the best option. I wasn’t sure whether it was recommended, since I couldn’t find any examples.

Oh, it’s definitely recommended for any non-trivial rendering. And it can give you more options than rendering directly to an OS window.

There’s no established OpenXR-specific way of doing this kind of rendering, though many runtimes provide this functionality themselves. It has been discussed as a possible extension, but I don’t think there’s anything public other than the Microsoft vendor extension for “first person observer”, which they’ve used for similar purposes (particularly to composite rendered content onto camera views on the HoloLens 2): The OpenXR Specification