Download Custom CoreML Model and Load for Usage [Swift]


I am creating an app based on a neural network, and the CoreML model is around 150 MB, so it's obvious that I can't ship it within the app.

To overcome this issue, I came across this article, which mentions that you can download and compile a CoreML model on device.

I did that and downloaded the model to my device, but the problem is that I cannot make predictions the way I did with the original model. The Xcode-generated class for the original model takes a UIImage-style input, but the plain MLModel API expects an MLFeatureProvider. Can anyone explain how to convert my input so I can use the compiled model the same way as the original?

do {
    // Compile the downloaded .mlmodel file and load the result.
    let compiledUrl = try MLModel.compileModel(at: modelUrl)
    let model = try MLModel(contentsOf: compiledUrl)
    debugPrint("Model compiled \(model.modelDescription)")
    // Problem: this API wants an MLFeatureProvider...
    //model.prediction(from: MLFeatureProvider)
    // ...but I want to call it like the generated class:
    //guard let prediction = try? model.prediction(image: pixelBuffer!) else {
    //    return
    //}
} catch {
    debugPrint("Error while compiling \(error.localizedDescription)")
}

There are 3 best solutions below

Answer 1:
    // Compile the model and wrap it for use with the Vision framework.
    let url = try! MLModel.compileModel(at: URL(fileURLWithPath: model))
    visionModel = try! VNCoreMLModel(for: MLModel(contentsOf: url))
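
Once you have the VNCoreMLModel, you can run predictions on images through a VNCoreMLRequest. A minimal sketch, assuming the model is an image classifier (adapt the result type if yours outputs something else):

```swift
import UIKit
import Vision
import CoreML

// Sketch: classify a UIImage with the runtime-compiled model via Vision.
func classify(image: UIImage, with visionModel: VNCoreMLModel) {
    guard let cgImage = image.cgImage else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        debugPrint("\(top.identifier): \(top.confidence)")
    }
    // Vision scales/crops the image to the model's expected input size,
    // so you don't need to build the CVPixelBuffer yourself.
    request.imageCropAndScaleOption = .centerCrop

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

This sidesteps the MLFeatureProvider question entirely, since Vision handles the image-to-feature conversion for you.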
Answer 2:

When you add an mlmodel file to your project, Xcode automatically generates a source file for you. That's why you were able to write model.prediction(image: ...) before.

If you compile your mlmodel at runtime then you don't have that special source file and you need to call the MLModel API yourself.

The easiest solution here is to add the mlmodel file to your project, copy-paste the automatically generated source file into a new source file, and use that with the mlmodel you compile at runtime. (After you've copied the generated source, you can remove the mlmodel again from your Xcode project.)
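
Calling the MLModel API yourself is also doable without the generated wrapper. A minimal sketch, assuming the model has a single image input named "image" (check `model.modelDescription.inputDescriptionsByName` for your model's actual input name) and that the pixel buffer already matches the model's expected dimensions:

```swift
import CoreML

// Sketch of a raw MLModel prediction without the generated wrapper.
// The input name "image" is an assumption; read the real name from
// model.modelDescription.inputDescriptionsByName.
func predict(with model: MLModel, pixelBuffer: CVPixelBuffer) throws -> MLFeatureProvider {
    let inputValue = MLFeatureValue(pixelBuffer: pixelBuffer)
    let provider = try MLDictionaryFeatureProvider(dictionary: ["image": inputValue])
    return try model.prediction(from: provider)
}
```

The returned MLFeatureProvider exposes the outputs by name via `featureValue(for:)`, just as the inputs were supplied by name.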

Also, if your model is 150 MB, you may want to consider making a smaller version of it by choosing an architecture that is better suited to mobile (not VGG16, which it seems you're currently using).

Answer 3:
Adding to what @Matthijs Hollemans said, here is how the copied generated classes can be used, in this case for a regression model:

    // RegressorFeatureProviderInput/Output are the classes copied from
    // the Xcode-generated source file.
    guard let raterOutput = try? regressionModel.prediction(
        from: RegressorFeatureProviderInput(feature1: 3.4, feature2: 4.5))
        else { return 0 }
    return Double(truncating: NSNumber(value: RegressorFeatureProviderOutput(features: raterOutput).isSaved))