How to detect an eye blink using the Google Vision API in Android?


I'm using the Vision API for face detection, and now I want to implement eye-blink detection. So far, however, the Vision API detects eye blinking in an image (photo) of a person, not in a live camera feed.

In addition, I am using a Tracker to keep track of the eye state over time, to detect the sequence of events that indicates a blink of the left eye:

left eye open -> left eye closed -> left eye open

The GraphicFaceTracker class is defined as follows:

private class GraphicFaceTracker extends Tracker<Face> {
        private GraphicOverlay mOverlay;
        private FaceGraphic mFaceGraphic;
        private Context context;

        GraphicFaceTracker(Context context, GraphicOverlay overlay) {
            mOverlay = overlay;
            this.context = context;
            mFaceGraphic = new FaceGraphic(overlay);
        }

        private final float OPEN_THRESHOLD = 0.85f;
        private final float CLOSE_THRESHOLD = 0.4f;

        private int state = 0;


        void blink(float value, final int eyeNo, String whichEye) {
            switch (state) {
                case 0:
                    if (value > OPEN_THRESHOLD) {
                        // Both eyes are initially open
                        state = 1;
                    }
                    break;

                case 1:
                    if (value < CLOSE_THRESHOLD ) {
                        // Both eyes become closed
                        state = 2;
                    }
                    break;

                case 2:
                    if (value > OPEN_THRESHOLD)  {
                        // Both eyes are open again
                        Log.i("BlinkTracker", "blink occurred!");

                        mCameraSource.takePicture(null, new CameraSource.PictureCallback() {
                            @Override
                            public void onPictureTaken(byte[] bytes) {
                                Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                                Log.d("BITMAP", bmp.getWidth() + "x" + bmp.getHeight());
                                System.out.println(bmp.getWidth() + "x" + bmp.getHeight());
                            }
                        });
                        state = 0;
                    }
                    break;
            }


        }

        /**
         * Start tracking the detected face instance within the face overlay.
         */
        @Override
        public void onNewItem(int faceId, Face item) {
            mFaceGraphic.setId(faceId);
        }

        /**
         * Update the position/characteristics of the face within the overlay.
         */
        @Override
        public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
            mOverlay.add(mFaceGraphic);
            mFaceGraphic.updateFace(face);

            float left = face.getIsLeftEyeOpenProbability();
            float right = face.getIsRightEyeOpenProbability();
            if (left == Face.UNCOMPUTED_PROBABILITY)  {
                // At least one of the eyes was not detected.
                return;
            }
            blink(left,0,"left");

            if(right == Face.UNCOMPUTED_PROBABILITY ){
                return ;
            }
        }
}

I have enabled "classifications" so that the detector indicates whether the eyes are open or closed:

FaceDetector detector = new FaceDetector.Builder(context)
            .setProminentFaceOnly(true) // optimize for single, relatively large face
            .setTrackingEnabled(true) // enable face tracking
            .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
            .setMode(FaceDetector.FAST_MODE) // for one face this is OK
            .build();

The tracker is then added as a processor for receiving face updates over time from the detector. For example, this configuration would be used to track whether the largest face in view has blinked:

Tracker<Face> tracker = new GraphicFaceTracker(this, mGraphicOverlay);
detector.setProcessor(new LargestFaceFocusingProcessor.Builder(detector, tracker).build());

But the above code detects a blink in an image of a person, and an image of a person cannot blink. How can I detect a blink from the live camera?


There are 4 answers below.


From the Face object you can get the probabilities below.

 float leftOpenScore = face.getIsLeftEyeOpenProbability();
 if (leftOpenScore == Face.UNCOMPUTED_PROBABILITY) {
     // left eye was not detected / classification unavailable
 } else if (leftOpenScore > 0.5f) { // 0.5f is an example threshold
     // left eye is open
 } else {
     // left eye is closed
 }

 float rightOpenScore = face.getIsRightEyeOpenProbability();
 if (rightOpenScore == Face.UNCOMPUTED_PROBABILITY) {
     // right eye was not detected / classification unavailable
 } else if (rightOpenScore > 0.5f) {
     // right eye is open
 } else {
     // right eye is closed
 }

Now you can pass these values wherever you need them in your own logic.
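
For example, here is a small sketch of one way to package that check for reuse. The isEyeOpen helper and the 0.5f cut-off are illustrative, not part of the Vision API:

    // Illustrative helper (not part of the Vision API): maps the open-eye probability
    // to true/false, or to null when the eye state could not be computed for this frame.
    private Boolean isEyeOpen(float openProbability) {
        if (openProbability == Face.UNCOMPUTED_PROBABILITY) {
            return null; // eye not detected / classification unavailable
        }
        return openProbability > 0.5f; // 0.5f is an example threshold
    }

    // Usage inside Tracker.onUpdate(FaceDetector.Detections<Face> detections, Face face):
    Boolean leftOpen = isEyeOpen(face.getIsLeftEyeOpenProbability());
    Boolean rightOpen = isEyeOpen(face.getIsRightEyeOpenProbability());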


I think that looks about right. If you associate the detector with a running CameraSource instance, like in this example:

https://developers.google.com/vision/android/face-tracker-tutorial

that would track the eye motion from the video camera. I also think you might change the onUpdate code a little to better decide when a blink has occurred:

    @Override
    public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
        mOverlay.add(mFaceGraphic);
        mFaceGraphic.updateFace(face);

        float left = face.getIsLeftEyeOpenProbability();
        float right = face.getIsRightEyeOpenProbability();
        if ((left == Face.UNCOMPUTED_PROBABILITY) ||
            (right == Face.UNCOMPUTED_PROBABILITY)) {
            // One of the eyes was not detected.
            return;
        }

        // Use the lower of the two scores, so a blink is only counted when both eyes close.
        // (This assumes blink() is simplified to take a single probability value.)
        float value = Math.min(left, right);
        blink(value);
    }
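
For completeness, the association with a running CameraSource mentioned above could look roughly like the sketch below. The preview SurfaceView parameter, the requested preview size, and the frame rate are illustrative, and mGraphicOverlay comes from the question's setup:

    // Sketch: wire the detector to a live camera feed so onUpdate() is called per preview frame.
    private void startLiveBlinkDetection(Context context, FaceDetector detector,
                                         SurfaceView preview) throws IOException {
        Tracker<Face> tracker = new GraphicFaceTracker(context, mGraphicOverlay);
        detector.setProcessor(
                new LargestFaceFocusingProcessor.Builder(detector, tracker).build());

        CameraSource cameraSource = new CameraSource.Builder(context, detector)
                .setFacing(CameraSource.CAMERA_FACING_FRONT)
                .setRequestedPreviewSize(640, 480) // illustrative size
                .setRequestedFps(15.0f)            // illustrative frame rate
                .build();

        // Requires the CAMERA runtime permission to already be granted.
        cameraSource.start(preview.getHolder());
    }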

You can pass your detector to the camera source and process blink detection while previewing through the surface view.

public class LivelinessScanFragment extends Fragment {

    SurfaceView cameraView;
    CameraSource cameraSource;
    final int RequestCameraPermissionID = 1001;
    FaceDetector detector;

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {

        switch (requestCode) {
            case RequestCameraPermissionID: {
                if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                    if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                        return;
                    }
                    try {
                        cameraSource.start(cameraView.getHolder());
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }


    public LivelinessScanFragment() {
        // Required empty public constructor
    }


    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {

            // Inflate the layout for this fragment
            View rootView = inflater.inflate(R.layout.fragment_liveliness_scan, container, false);



            cameraView = (SurfaceView)rootView.findViewById(R.id.surface_view);

            detector = new FaceDetector.Builder(getActivity())
                .setProminentFaceOnly(true) // optimize for single, relatively large face
                .setTrackingEnabled(true) // enable face tracking
                .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
                .setMode(FaceDetector.FAST_MODE) // for one face this is OK
                .build();


            if (!detector.isOperational()) {
                Log.w("MainActivity", "Detector Dependencies are not yet available");
            } else {
                cameraSource = new CameraSource.Builder(Application.getContext(), detector)
                        .setFacing(CameraSource.CAMERA_FACING_FRONT)
                        .setRequestedFps(2.0f)
                        .setRequestedPreviewSize(1280, 1024)
                        .setAutoFocusEnabled(true)
                        .build();

                cameraView.getHolder().addCallback(new SurfaceHolder.Callback() {
                    @Override
                    public void surfaceCreated(SurfaceHolder surfaceHolder) {
                        try {
                            if (ActivityCompat.checkSelfPermission(Application.getContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {

                                ActivityCompat.requestPermissions(getActivity(),
                                        new String[]{Manifest.permission.CAMERA}, RequestCameraPermissionID);
                                return;
                            }
                            cameraSource.start(cameraView.getHolder());
                            detector.setProcessor(
                                    new LargestFaceFocusingProcessor(detector, new GraphicFaceTracker()));

                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }

                    @Override
                    public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {

                    }

                    @Override
                    public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                        cameraSource.stop();
                    }
                });


            }

            return rootView;
        }

    private class GraphicFaceTracker extends Tracker<Face> {

        private final float OPEN_THRESHOLD = 0.85f;
        private final float CLOSE_THRESHOLD = 0.4f;

        private int state = 0;


        void blink(float value) {
            switch (state) {
                case 0:
                    if (value > OPEN_THRESHOLD) {
                        // Both eyes are initially open
                        state = 1;
                    }
                    break;

                case 1:
                    if (value < CLOSE_THRESHOLD ) {
                        // Both eyes become closed
                        state = 2;
                    }
                    break;

                case 2:
                    if (value > OPEN_THRESHOLD)  {
                        // Both eyes are open again
                        Log.i("BlinkTracker", "blink occurred!");
                        state = 0;

                    }
                    break;
            }


        }

        /**
         * Update the position/characteristics of the face within the overlay.
         */
        @Override
        public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {

            float left = face.getIsLeftEyeOpenProbability();
            float right = face.getIsRightEyeOpenProbability();
            if ((left == Face.UNCOMPUTED_PROBABILITY) ||
                    (right == Face.UNCOMPUTED_PROBABILITY)) {
                // One of the eyes was not detected.
                return;
            }

            float value = Math.min(left, right);
            blink(value);
        }
    }


}
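
One lifecycle detail this fragment leaves out: the CameraSource and the FaceDetector hold on to the camera and native resources, so it is worth releasing them when the fragment is torn down. A possible sketch, using the fields declared above:

    @Override
    public void onDestroy() {
        super.onDestroy();
        // Free the camera and the detector's native resources.
        if (cameraSource != null) {
            cameraSource.release();
        }
        if (detector != null) {
            detector.release();
        }
    }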

Here is an open-source GitHub project, an eye blink detector for Android, that detects eye blinks in real time and is implemented on top of the FaceDetector API.