YUV_NV21 Android preview Frame to opencv_core.Mat RGB image, then convert the RGB Mat back to a YUV_NV21 Frame for recording


I've been searching for a relatively long time for an answer to this question, and, believe me, I am very surprised that I haven't found anything yet. So, at the risk of a huge oversight on my behalf in my search, I am appealing to whoever has the knowledge, time and kindness to at least push me in the right direction.

I confess that I am a beginner at OpenCV and computer vision in general, but I was able to proof-test a concept of mine on a Windows/Python platform and would like to try it in an Android environment.

Using the RecordActivity.java example as my basis, in the code below I would like to:

Step 1

Convert the YUV_NV21 Android phone camera preview Frame (yuvImage) to an opencv_core.Mat RGB image (rgbImageMat), and then

Step 2

Convert the opencv_core.Mat RGB image (rgbImageMat) back to a YUV_NV21 Android preview Frame (tmpyuvImage), which gets recorded instead of the original yuvImage Frame.

For the first step I am using the following conversion

opencv_imgproc.cvtColor(yuvImageMat,rgbImageMat,COLOR_YUV420sp2RGB)
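As a sanity check on Step 1: COLOR_YUV420sp2RGB expects the source as a single-channel Mat of height*3/2 rows by width columns, which matches the NV21 buffer layout of a full-resolution Y plane followed by a half-size interleaved VU plane. A small plain-Java sketch of that size arithmetic (class and method names are illustrative, not from any API):

```java
public class Nv21Layout {
    // Total bytes in an NV21 preview buffer: width*height luma bytes plus
    // width*height/2 interleaved chroma bytes (4:2:0 subsampling).
    public static int bufferSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Row count of the single-channel Mat that cvtColor expects for
    // COLOR_YUV420sp2RGB: the buffer viewed as (height * 3 / 2) x width.
    public static int matRows(int height) {
        return height * 3 / 2;
    }
}
```

For a 640x480 preview this gives a 460800-byte buffer viewed as a 720x640 CV_8UC1 Mat; if the input Mat is not shaped this way, the forward conversion will already be subtly wrong before the reverse step is attempted.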

Logically for the reverse step I am using

opencv_imgproc.cvtColor(rgbImageMat,yuvImageMat,COLOR_RGB2YUV_***)

I have tried various available COLOR_RGB2YUV_*** alternatives, but for some reason the reverse conversion is not happening. What am I doing wrong? Does anyone know what this color code should be?
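A possible explanation, worth verifying against the OpenCV documentation: cvtColor offers no COLOR_RGB2YUV_NV21 code at all; its 4:2:0 output targets (COLOR_RGB2YUV_I420, COLOR_RGB2YUV_YV12) are planar, while NV21 is semi-planar. So one workaround is to convert with COLOR_RGB2YUV_I420 and then repack the separate U and V planes into the interleaved V/U pairs NV21 expects. A plain-Java sketch of just the repacking step (i420ToNv21 is a hypothetical helper name, not a library call; even width and height assumed):

```java
public class I420ToNv21 {
    // Repack a planar I420 buffer (Y plane, U plane, V plane) into NV21
    // (Y plane, then interleaved V/U bytes, V first).
    public static byte[] i420ToNv21(byte[] i420, int width, int height) {
        int ySize = width * height;
        int qSize = ySize / 4;                 // size of each chroma plane
        byte[] nv21 = new byte[ySize + 2 * qSize];
        System.arraycopy(i420, 0, nv21, 0, ySize);  // Y plane is unchanged
        int uOff = ySize;                      // U plane start in I420
        int vOff = ySize + qSize;              // V plane start in I420
        for (int i = 0; i < qSize; i++) {
            nv21[ySize + 2 * i]     = i420[vOff + i];  // V byte first...
            nv21[ySize + 2 * i + 1] = i420[uOff + i];  // ...then U (NV21 order)
        }
        return nv21;
    }
}
```

The Y plane is copied as-is; only the two quarter-size chroma planes are woven together, V byte first, which is what distinguishes NV21 from NV12.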

Thank You !!

The source code follows

   @Override
    public void onPreviewFrame(byte[] data, Camera camera) {

        Log.i(LOG_TAG,"onPreviewFrame");
        if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
            startTime = System.currentTimeMillis();
            return;
        }
        if (RECORD_LENGTH > 0) {
            int i = imagesIndex++ % images.length;
            yuvImage = images[i];
            timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
        }

        /* get video data */
        if (yuvImage != null && recording) {
            ((ByteBuffer)yuvImage.image[0].position(0)).put(data);
            /**********************************************************************************/
    //Convert YUV Frame to RGB Mat
            Log.i(LOG_TAG,"getPreviewFormat() : "+camera.getParameters().getPreviewFormat()); //=NV21
        //Set up the Frame <=> Mat conversion
            OpenCVFrameConverter.ToMat converter = new OpenCVFrameConverter.ToMat();
        //yuvImage Frame to Mat
            Frame tmpyuvImage = yuvImage.clone();
            opencv_core.Mat yuvImageMat = converter.convert(tmpyuvImage);
        //yuvImageMat to RGB Mat
            opencv_core.Mat rgbImage = new opencv_core.Mat();
            opencv_imgproc.cvtColor(yuvImageMat, rgbImage, CV_YUV420sp2RGB);//COLOR_YUV2RGB_NV21 is the same code as COLOR_YUV420sp2RGB
    //Do the reverse conversion
            Log.i(LOG_TAG, "#### Begin reverse conversion ####");
            opencv_imgproc.cvtColor(rgbImage, yuvImageMat, COLOR_BGR2YUV_YV12);//What is the inverse COLOR code to the CV_YUV420sp2RGB one?

        //yuvImageMat to Frame
            tmpyuvImage = converter.convert(yuvImageMat);//Mat back to a Frame

            /**********************************************************************************/
            if (RECORD_LENGTH <= 0)
                try {
                        Log.v(LOG_TAG,"Writing Frame");
                        long t = 1000 * (System.currentTimeMillis() - startTime);
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        if(addFilter) {
                            filter.push(yuvImage);
                            Frame frame2;
                            while ((frame2 = filter.pull()) != null) {
                                recorder.record(frame2);
                            }
                        } else {
                            Log.i(LOG_TAG,"yuvImage.imageChannels before rec :"+Integer.toString(yuvImage.imageChannels));                               
                            //recorder.record(yuvImage);//Argument has to be of type Frame
                            recorder.record(tmpyuvImage);
                        }
                    } catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
        }
    }