How would one go about mirroring or cloning the WebXR 'immersive-xr' view from an HMD like the VIVE or Oculus in the browser, using the same WebGL canvas?
There is much discussion about copying the pixels to a texture2D and then applying that as a render texture, or about completely re-drawing the entire scene with an adjusted viewTransform. These approaches work well if you are rendering a different view, such as a remote camera or a third-person spectator view, but both are a waste of resources if you only want to mirror the current HMD view on the desktop.
Self-answered below, as there was no solid answer when I ran into this and I'd like to save future devs the time (especially if they're not all that savvy with WebGL2 and WebXR).
Note that I'm not using any existing frameworks for this project, for 'reasons'. It shouldn't change much if you are; you'd just need to perform the steps at the appropriate place in your library's render pipeline.
The answer is delightfully simple as it turns out, and it barely dents my fps:
1. Set up your WebGL2 context with `{xrCompatible: true, webgl2: true, antialias: false}`.
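For reference, a minimal context setup could look like the sketch below; note that I'm requesting the 'webgl2' context type directly rather than passing a webgl2 flag, and `glCanvas` is just an illustrative name:

```javascript
// Sketch: create a canvas and an XR-compatible WebGL2 context
// with the flags from step 1.
const glCanvas = document.createElement('canvas');
document.body.appendChild(glCanvas);
const gl = glCanvas.getContext('webgl2', { xrCompatible: true, antialias: false });
```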
2. Create a spectator framebuffer, `spectateBuffer`.
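Any plain framebuffer with a color attachment sized to the desktop canvas will do; here's a rough sketch (`initSpectateBuffer` and `spectateTexture` are illustrative names, not anything mandated by WebXR):

```javascript
// Sketch: a framebuffer with a single color texture to hold the mirrored view.
let spectateBuffer = null;
let spectateTexture = null;

function initSpectateBuffer(width, height) {
  spectateTexture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, spectateTexture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

  spectateBuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, spectateBuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, spectateTexture, 0);
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
}
```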
3. Draw your `immersive-xr` layer as usual in your `xrSession.requestAnimationFrame(OnXRFrame);` callback.
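Inside `OnXRFrame`, once the eye views have been rendered into the `XRWebGLLayer`'s framebuffer, one way to grab the image is to blit one eye's viewport into `spectateBuffer`. This is a sketch under the assumption that the layer is non-antialiased so its opaque framebuffer can be read from; `session`, `view`, `spectateBuffer` and `glCanvas` come from the usual WebXR/WebGL boilerplate rather than from anything shown above:

```javascript
// Inside OnXRFrame, after the scene has been drawn for each view:
const glLayer = session.renderState.baseLayer;   // the immersive-xr XRWebGLLayer
const viewport = glLayer.getViewport(view);      // region of the eye we want to mirror

// Copy that eye's region of the XR framebuffer into the spectator framebuffer.
gl.bindFramebuffer(gl.READ_FRAMEBUFFER, glLayer.framebuffer);
gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, spectateBuffer);
gl.blitFramebuffer(
  viewport.x, viewport.y, viewport.x + viewport.width, viewport.y + viewport.height,
  0, 0, glCanvas.width, glCanvas.height,
  gl.COLOR_BUFFER_BIT, gl.NEAREST
);
```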
4. Just before exiting your `OnXRFrame` method, implement a call to draw the spectator view (see the sketch below). I personally used a bool `showCanvas` to allow me to toggle the spectator mirror on and off as desired.

Note: I'm only showing one of my HMD eye views as the spectator view; to show both you would need to store a spectator framebuffer per eye and blit them together side by side.
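A minimal sketch of that last step, with `DrawSpectator` simply blitting `spectateBuffer` onto the canvas's default framebuffer (the function and variable names here are illustrative):

```javascript
// At the very end of OnXRFrame:
if (showCanvas) {
  DrawSpectator();
}

// Blit the spectator framebuffer onto the visible canvas (default framebuffer).
function DrawSpectator() {
  gl.bindFramebuffer(gl.READ_FRAMEBUFFER, spectateBuffer);
  gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, null);   // null targets the canvas itself
  gl.blitFramebuffer(
    0, 0, glCanvas.width, glCanvas.height,
    0, 0, glCanvas.width, glCanvas.height,
    gl.COLOR_BUFFER_BIT, gl.NEAREST
  );
}
```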
I hope this saves future googlers some pain.