Shared video not rendering in normal mode, but rendering in Desktop mode, in the Android Firefox browser


I am implementing the Zoom Video SDK for Web using React and TypeScript. I have run into an issue: in the Android Firefox browser, when I press the video icon, the video starts, but it is not rendered on the provided canvas as feedback that the video is being shared over the internet.

However, when I switch to 'Desktop Mode' in the Android Firefox browser, the video is rendered.

I have this code, which renders the video on the canvas element, or on a video element if one is present.

Render video with a canvas element:

const startVideoWithCanvas = useCallback(async() => {
        const startVideoOptions = { hd: true };
        if (mediaStream?.isSupportVirtualBackground() && isBlur) {
            Object.assign(startVideoOptions, { virtualBackground: { imageUrl: 'blur' } });
        }

        try {
            await mediaStream?.startVideo(startVideoOptions)
            .catch((err) => {
                message.error('Die Videoaufnahme konnte nicht gestartet werden.');
                setIsCameraDisabled(() => true);
            });
        } catch (error) {
            console.log(error);
        }

        if (!mediaStream?.isSupportMultipleVideos()) {
            console.log('selfVideoLayout', selfVideoLayout);

            if (!visibleParticipants || !videoRef) {
                return;
            }

            const index = visibleParticipants.findIndex(
                (user) => user.userId === zmClient.getCurrentUserInfo().userId
            );

            if (index > selfVideoLayout.length - 1 || index < 0) {
                return;
            }

            const dimension = selfVideoLayout[index];
            const { width, height, x: _x, y: _y } = dimension;
            const { height: canvasHeight } = canvasDimension;

            // renderVideo measures x/y from the bottom-left corner of the canvas,
            // so flip the top-left-based layout y coordinate.
            const x = _x;
            const y = canvasHeight - _y - height;

            console.log('w, h, x, y, _y, canvasHeight', width, height, x, y, _y, canvasHeight);

            try {
                mediaStream?.renderVideo(
                    videoRef.current, 
                    zmClient.getCurrentUserInfo().userId, 
                    width, 
                    height, 
                    _x, 
                    _y, 
                    3
                ).then(() => {
                    console.log('Started video with x at ', _x);
                    setHidSelf(true);

                    mediaStream?.adjustRenderedVideoPosition(
                        videoRef.current,
                        visibleParticipants[index].userId,
                        // zmClient.getSessionInfo().userId,
                        width,
                        height,
                        _x,
                        _y
                    ).catch(err => {
                        console.log(err);
                    });
                }).catch((err) => {
                    console.log(err);
                    message.error('Die Videoaufnahme konnte nicht gestartet werden.');
                    setIsCameraDisabled(() => true);
                });
            } catch (error) {
                console.log(error);
            }
        }
    }, [canvasDimension, isBlur, mediaStream, selfVideoLayout, videoRef, visibleParticipants, zmClient]);
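Since `renderVideo` measures its x/y coordinates from the bottom-left corner of the canvas, while CSS layouts measure from the top-left, the flip in the code above can be isolated into a small pure helper and unit-tested without the SDK. This is a minimal sketch; `LayoutRect` and `toRenderVideoCoords` are illustrative names of mine, not part of the SDK:

```typescript
// Hypothetical helper names; only the coordinate math mirrors the code above.
interface LayoutRect {
  x: number;      // left offset, top-left origin (CSS convention)
  y: number;      // top offset, top-left origin (CSS convention)
  width: number;
  height: number;
}

// Convert a top-left-origin layout rectangle into the bottom-left-origin
// coordinates that renderVideo expects. Width and height are unchanged.
function toRenderVideoCoords(rect: LayoutRect, canvasHeight: number): LayoutRect {
  return {
    x: rect.x,
    y: canvasHeight - rect.y - rect.height,
    width: rect.width,
    height: rect.height,
  };
}
```

Note that the snippet above computes the flipped `y` but still passes the unconverted `_x`/`_y` to `renderVideo`, which may be worth double-checking.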

Render video with a video element:

const startVideoWithVideoElement = useCallback(async() => {
        try {
            await mediaStream?.startVideo({ videoElement })
            .then(() => {
                setHidSelf(true);
            })
            .catch((err) => {
                console.log('camera error', err);
                message.error('Die Videoaufnahme konnte nicht gestartet werden.');
                setIsCameraDisabled(() => true);
            });
        } catch (error) {
            console.log(error);
        }

    }, [mediaStream, videoElement]);

Select which element to render the video on, then render:

const onCameraClick = useCallback(async () => {
        if (isStartedVideo) {
            await endVideoCapturing();
        } else {
            console.log('isAndroidBrowser()', isAndroidBrowser());
            console.log('isSupportOffscreenCanvas()', isSupportOffscreenCanvas());
            console.log('!mediaStream?.isSupportMultipleVideos()', !mediaStream?.isSupportMultipleVideos());
            console.log('!isSupportWebCodecs()', !isSupportWebCodecs());

            if (
                isAndroidBrowser() || 
                (
                    isSupportOffscreenCanvas() && 
                    !mediaStream?.isSupportMultipleVideos()
                )
            ) {
                if (videoElement) {
                    await startVideoWithVideoElement();                    
                } else {
                    console.log('Could not find video element');
                    await startVideoWithCanvas();
                }
            } else {
                await startVideoWithCanvas();
            }

            if (!mediaStream?.isCapturingVideo()) {
                if (mediaStream?.isCaptureForbidden()) {
                    message.error('Das Aufnehmen von Videos ist vom Benutzer verboten');
                    console.log('video forbidden');
                } else {
                    message.error('Die Videoaufnahme konnte nicht gestartet werden.');
                    console.log('could not start video');
                }
                setIsCameraDisabled(() => true);
            }

            setIsStartedVideo(!!mediaStream?.isCapturingVideo());
        }
    }, [isStartedVideo, endVideoCapturing, mediaStream, videoElement, startVideoWithVideoElement, startVideoWithCanvas]);
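The branch that decides between the two render paths can also be expressed as a pure function, which makes the browser-capability logic above easy to unit-test without the SDK. `selectRenderPath` and its options object are illustrative names of mine, not SDK APIs; the flags stand in for the results of the SDK's detection helpers:

```typescript
interface RenderPathFlags {
  isAndroidBrowser: boolean;        // e.g. result of isAndroidBrowser()
  supportsOffscreenCanvas: boolean; // e.g. result of isSupportOffscreenCanvas()
  supportsMultipleVideos: boolean;  // e.g. mediaStream.isSupportMultipleVideos()
  hasVideoElement: boolean;         // a <video> element is available to render into
}

// Mirrors the condition in onCameraClick: prefer the <video> element on
// Android browsers, or when OffscreenCanvas is supported but multiple
// videos are not; otherwise fall back to canvas rendering.
function selectRenderPath(flags: RenderPathFlags): 'video-element' | 'canvas' {
  const preferVideoElement =
    flags.isAndroidBrowser ||
    (flags.supportsOffscreenCanvas && !flags.supportsMultipleVideos);
  return preferVideoElement && flags.hasVideoElement ? 'video-element' : 'canvas';
}
```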

This same code works fine in the Android Google Chrome browser, desktop Google Chrome, and desktop Firefox.

I can't figure out where the problem comes from.


Answer by tommygaessler:

Zoom Developer Advocate here.

I just tested Video SDK web 1.10.8 with the code from the Zoom Video SDK web start video documentation, and it worked properly on Android 14 with Firefox 123.1.0 (the latest versions at the time of this post). You can test it via our deployed Video SDK web UI Toolkit demo.

<video id="my-self-view-video" width="1920" height="1080"></video>
<canvas id="my-self-view-canvas" width="1920" height="1080"></canvas>

#my-self-view-video, #my-self-view-canvas {
  width: 100%;
  height: auto;
}

if (stream.isRenderSelfViewWithVideoElement()) {
  stream.startVideo({ videoElement: document.querySelector('#my-self-view-video') }).then(() => {
    // video successfully started and rendered
  })
} else {
  stream.startVideo().then(() => {
    stream.renderVideo(document.querySelector('#my-self-view-canvas'), client.getCurrentUserInfo().userId, 1920, 1080, 0, 0, 3).then(() => {
      // video successfully started and rendered
    })
  })
}

(Screenshot: Zoom Video SDK running in Android Firefox)

By the way, we have a simpler method of rendering videos now using individual HTML elements:

<video-player-container></video-player-container>

video-player-container {
  width: 100%;
  height: 1000px;
}

video-player {
  width: 100%;
  height: auto;
  aspect-ratio: 16/9;
}

stream.startVideo().then(() => {
  stream.attachVideo(client.getCurrentUserInfo().userId, RESOLUTION).then((userVideo) => {
    document.querySelector('video-player-container').appendChild(userVideo)
  })
})

Feel free to move the discussion over to the Zoom devforum.