CIDetector either not detecting, or detecting in odd places


I am practicing with some Swift 2 and have run into some difficulties using CIDetector.

I have an app which contains a series of pictures: three different rectangles and three different photos of people/groups. I have been trying CIDetector on these images to see what is recognised. The most success I have had is with the faces; however, the faces it recognises are in very weird places on the image.

Here is a rectangle image I tested, along with its output: (image: Rectangle Test)

And here is a face image: (image)

Here is my detection code:

    ///Does the detecting of features
    func detectFeatures() -> [CIFeature]? {

        print("detecting")

        let detectorOptions: [String: AnyObject] = [CIDetectorAccuracy: CIDetectorAccuracyHigh,
                                                    CIDetectorImageOrientation: 6]
        let detectorType: String

        switch currentPhoto.currentType {
        case .RECTANGLES:
            detectorType = CIDetectorTypeRectangle
        case .FACES:
            detectorType = CIDetectorTypeFace
        default:
            print("error in phototype")
            return nil
        }

        let detector = CIDetector(ofType: detectorType, context: nil, options: detectorOptions)

        // CIImage(image:) can fail, so unwrap before handing it to the detector
        guard let image = CIImage(image: imageViewer.image!) else {
            print("image not cast to CIImage")
            return nil
        }

        return detector.featuresInImage(image)
    }

Where the stars are added on:

    for f in features! {
        print(f.bounds.origin)
        imageViewer.layer.addSublayer(createMarker(f.bounds.size, location: f.bounds.origin))
    }

and where the stars are created:

    ///Creates a star on a layer and puts it over the point passed in
    func createMarker(size: CGSize, location: CGPoint) -> CALayer {

        print("The marker will be at [\(location.x),\(location.y)]")
        //doesn't appear in correct place on the screen
        let newLayer = CALayer()
        // frame fully positions and sizes the layer; setting bounds beforehand is redundant
        newLayer.frame = CGRect(origin: location, size: size)
        newLayer.contents = UIImage(named: "star")?.CGImage
        newLayer.backgroundColor = UIColor.clearColor().CGColor
        newLayer.masksToBounds = false

        return newLayer
    }

Has anyone encountered anything like this before?

1 Answer
Unfortunately, I don't think the rectangle images you are using are the kind the detector is designed for. If you look at the CIDetector Class Reference, it reads:

> CIDetectorTypeRectangle: A detector that searches for rectangular areas in a still image or video, returning CIRectangleFeature objects that provide information about detected regions.
>
> The rectangle detector finds areas that are likely to represent rectangular objects that appear in perspective in the image, such as papers or books seen on a desktop.

Taking this advice, I ran your code on this image: (image)

With the results:

    bounds = "(296.752716064453, 74.7079086303711, 181.281158447266, 253.205474853516)\n"

I also ran your code on other plain rectangle drawings, and no features were found. The input needs to be somewhat of a "real-world" image.
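As for the face markers landing in odd places: Core Image reports feature bounds in its own coordinate space, whose origin is the bottom-left of the image, while UIKit layers measure from the top-left. A minimal sketch of the vertical flip, assuming the image and the view share the same size (the function name is mine, not part of your code):

```swift
import CoreGraphics

// Core Image origin is bottom-left; UIKit origin is top-left.
// Flip a feature's bounds vertically before placing a CALayer over it.
func convertToUIKitCoordinates(featureBounds: CGRect, imageHeight: CGFloat) -> CGRect {
    return CGRect(x: featureBounds.origin.x,
                  y: imageHeight - featureBounds.origin.y - featureBounds.size.height,
                  width: featureBounds.size.width,
                  height: featureBounds.size.height)
}
```

You would then pass the converted origin to `createMarker` instead of `f.bounds.origin`. If the image is scaled to fit the view (e.g. aspect-fit), the rect would also need scaling by the view-to-image ratio.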