Is it possible to render an OVRCameraRig eye to a Texture2D on Oculus Quest 2 in Unity?


I am attempting to stream out the current in-game view over a WebRTC connection. My goal is to capture what the user is seeing as an RGB 24 BPP byte array. I am currently able to stream an empty Texture2D; I would like to populate it with the OVRCameraRig's current in-game view.

I am not a strong Unity developer, but I assumed it might look something like this:

private Texture2D tex;
private RenderTexture rt;
private OVRCameraRig oVRCameraRig;

void Start() {
    // I only have 1 camera rig
    oVRCameraRig = GameObject.FindObjectOfType<OVRCameraRig>();
    tex = new Texture2D(640, 480, TextureFormat.RGB24, false);
    // 8 is not a valid depth buffer size (valid values are 0, 16, and 24/32),
    // and three-channel formats like R8G8B8_SRGB are rarely supported as render
    // targets; ARGB32 is, and ReadPixels converts down to RGB24 on the CPU read
    rt = new RenderTexture(640, 480, 24, RenderTextureFormat.ARGB32);
    // Assign the target once here rather than on every frame
    oVRCameraRig.leftEyeCamera.targetTexture = rt;
}

void Update() {
    // Read the most recent rendered frame back to the CPU
    RenderTexture.active = rt;
    tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    tex.Apply();
    RenderTexture.active = null;
}
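
One concern with the snippet above: a camera with a targetTexture renders into that texture instead of the display, so redirecting leftEyeCamera would likely blank that eye in the headset. A workaround I'm considering is a separate mirror camera parented to the rig's centerEyeAnchor so it follows the headset pose; this is only a sketch, and the MirrorCamera name is made up:

// Sketch of a mirror camera that follows the HMD but renders only into rt,
// leaving the real eye cameras untouched.
void SetupMirrorCamera() {
    GameObject go = new GameObject("MirrorCamera");
    // centerEyeAnchor tracks the headset pose, so a child camera follows it
    go.transform.SetParent(oVRCameraRig.centerEyeAnchor, false);
    Camera mirrorCam = go.AddComponent<Camera>();
    mirrorCam.CopyFrom(oVRCameraRig.leftEyeCamera); // match FOV, clipping, etc.
    mirrorCam.stereoTargetEye = StereoTargetEyeMask.None; // keep it off the HMD
    mirrorCam.targetTexture = rt; // set after CopyFrom, which copies targetTexture too
}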
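
Assuming the texture ends up populated by either approach, pulling the RGB 24 BPP bytes out for the WebRTC side should just be a raw read of the texture. A minimal sketch, where webRtcConnection and SendFrame are placeholders for whatever my WebRTC wrapper actually exposes, not a real API:

// Hand the populated Texture2D to the streaming layer.
void SendCurrentFrame() {
    // TextureFormat.RGB24 gives a tightly packed 3-bytes-per-pixel buffer,
    // which matches the RGB 24 BPP format I want to stream
    byte[] rgb = tex.GetRawTextureData();
    webRtcConnection.SendFrame(rgb, tex.width, tex.height);
}

One thing I'd also flag for myself: ReadPixels stalls the render pipeline every frame, which may be expensive on Quest 2, so AsyncGPUReadback might be worth looking at as the non-blocking alternative.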