I captured a video using an Azure Kinect SDK device and extracted the RGB (color) track using MKVToolNix. I can play back this RGB track in Azure Kinect Viewer (v1.4.1). However, when I re-encode the video using FFmpeg (the command is given below), I cannot play it back in Azure Kinect Viewer. I get the message:
Failed to open Recording
ffmpeg -i .\RGB.mkv -c:v libx264 -preset veryfast -crf 18 -c:a aac -b:a 128k RGG_Enc.mkv
I tried different codecs (H.264, H.265, VP9, VP10, MJPEG) and different settings, but I still could not play the video in Azure Kinect Viewer. Can someone suggest what to do?
Here is the FFprobe output for the original video.
C:\Users\Ashutosh\Desktop\RGB_Encoding> ffprobe -i .\RGB.mkv
ffprobe version 6.0-full_build-www.gyan.dev Copyright (c) 2007-2023 the FFmpeg developers
built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
libavutil 58. 2.100 / 58. 2.100
libavcodec 60. 3.100 / 60. 3.100
libavformat 60. 3.100 / 60. 3.100
libavdevice 60. 1.100 / 60. 1.100
libavfilter 9. 3.100 / 9. 3.100
libswscale 7. 1.100 / 7. 1.100
libswresample 4. 10.100 / 4. 10.100
libpostproc 57. 1.100 / 57. 1.100
[matroska,webm @ 0000018ef5f4e240] Could not find codec parameters for stream 1 (Attachment: none): unknown codec
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, matroska,webm, from '.\RGB.mkv':
Metadata:
title : Azure Kinect
encoder : libebml v1.4.4 + libmatroska v1.7.1
creation_time : 2023-08-04T11:52:43.000000Z
Duration: 00:00:40.03, start: 0.033000, bitrate: 126708 kb/s
Stream #0:0(eng): Video: mjpeg (Baseline) (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/unknown/unknown), 2048x1536, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn (default)
Metadata:
title : COLOR
BPS : 126696930
DURATION : 00:00:40.029000000
NUMBER_OF_FRAMES: 1201
NUMBER_OF_BYTES : 633943931
_STATISTICS_WRITING_APP: mkvmerge v76.0 ('Celebration') 64-bit
_STATISTICS_WRITING_DATE_UTC: 2023-08-04 11:52:43
_STATISTICS_TAGS: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
Stream #0:1: Attachment: none
Metadata:
filename : calibration.json
mimetype : application/octet-stream
Unsupported codec with id 0 for input stream 1
This example video contains only one channel (RGB) from the original video. I am able to play this video using Azure Kinect Viewer.
It's possible that FFmpeg does not produce files that are compatible with Azure Kinect Viewer.
Azure Kinect Viewer seems to expect an MKV file containing YUV or JPEG data as frames, i.e. the same kind of colour track the Kinect recorder itself writes.
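As a sketch rather than a guaranteed fix: since your FFprobe output shows the original colour stream is MJPEG in yuvj422p, you could try re-encoding back to MJPEG so the stream type matches what the recorder wrote (the output name RGB_MJPEG.mkv and the quality value are just example choices):

ffmpeg -i .\RGB.mkv -c:v mjpeg -pix_fmt yuvj422p -q:v 2 -metadata:s:v:0 title=COLOR RGB_MJPEG.mkv

Here -q:v 2 keeps the MJPEG quality high, and the title metadata mirrors the COLOR track title visible in the original file.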
The fastest way to solve this is to recreate the structure of a known working file (one that plays in Azure Kinect Viewer) but replace its internal frames with your own (e.g. feed in your encoded video frame by frame).
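For example (assuming MKVToolNix's mkvinfo tool is available, since MKVToolNix was already used for the extraction), you can dump the container structure of the file that plays and of the re-encoded one, then compare the two dumps; the output file names are just examples:

mkvinfo .\RGB.mkv > RGB_structure.txt
mkvinfo .\RGG_Enc.mkv > RGG_Enc_structure.txt

Differences in track codec, track name, or attachments between the two dumps are good candidates for what the viewer objects to.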
This means you will need to provide a testable file for more specific advice.
If you are not sure what format the RGB data is in, check it with FFprobe as shown below, or check the output settings in your Kinect recording configuration.
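A quick way to check just the colour stream's format from the command line is:

ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,pix_fmt,width,height,r_frame_rate -of default=noprint_wrappers=1 .\RGB.mkv

For your original file this should report mjpeg, yuvj422p, 2048x1536 and 30/1, matching the full FFprobe dump above.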
A hex editor shows you the file structure in raw bytes. In the editor, look for the bytes that mark the start of your recorded colour frames, e.g. FF D8 FF (the JPEG start-of-image marker) for the first frame.
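If you would rather not use a hex editor, a rough equivalent (assuming the colour track is MJPEG, as FFprobe reports) is to copy the first frame out losslessly and inspect its first bytes, for instance in PowerShell:

ffmpeg -i .\RGB.mkv -map 0:v:0 -c:v copy -frames:v 1 first_frame.jpg
Format-Hex .\first_frame.jpg | Select-Object -First 1

The first row of output should begin with FF D8 FF if the frame really is a valid JPEG.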