I am creating an app based on a neural network, and the CoreML model is around 150 MB, so obviously I can't ship it inside the app bundle.
To overcome this, I came across this article, which explains that you can download and compile a CoreML model on device.
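
The download step from the article comes down to something like this (a rough sketch; the URL is just a placeholder for wherever the .mlmodel file is hosted):

import Foundation

// Placeholder URL for the hosted .mlmodel file
let modelRemoteUrl = URL(string: "https://example.com/MyModel.mlmodel")!

// Download the raw model to a temporary file; the compile step has to happen
// inside the completion handler, since the temporary file is removed after it returns
URLSession.shared.downloadTask(with: modelRemoteUrl) { tempUrl, _, error in
    guard let modelUrl = tempUrl, error == nil else { return }
    // modelUrl is what gets passed to MLModel.compileModel(at:) below
}.resume()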
That part works: the model downloads and compiles on my device. The problem is that I can't run predictions the way I could with the original model. The original model's generated class takes a UIImage (converted to a CVPixelBuffer) as input, but the generic MLModel API only accepts an MLFeatureProvider. Can anyone explain how to convert my input for this model and use it like the original? Here is my code so far:
do {
    // Compile the downloaded .mlmodel into the .mlmodelc format the runtime loads
    let compiledUrl = try MLModel.compileModel(at: modelUrl)
    let model = try MLModel(contentsOf: compiledUrl)
    debugPrint("Model compiled \(model.modelDescription)")

    //model.prediction(from: MLFeatureProvider) // Problem: expects an MLFeatureProvider
    //It should work like the generated class did:
    //guard let prediction = try? model.prediction(image: pixelBuffer!) else {
    //    return
    //}
} catch {
    debugPrint("Error while compiling \(error.localizedDescription)")
}
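
From the docs, I think the generic API wants the input wrapped in a feature provider. This is a minimal sketch of what I've pieced together so far; the feature names "image" and "classLabel" are my guesses, since the real names have to come from model.modelDescription:

import CoreML
import CoreVideo

// Minimal sketch: wrap the pixel buffer in an MLFeatureValue, pass it to the
// model via an MLDictionaryFeatureProvider, and read the output feature back.
// "image" and "classLabel" are assumed names -- the real ones are listed in
// model.modelDescription.inputDescriptionsByName / outputDescriptionsByName.
func predictLabel(model: MLModel, pixelBuffer: CVPixelBuffer) throws -> String? {
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["image": MLFeatureValue(pixelBuffer: pixelBuffer)])
    let output = try model.prediction(from: input)
    return output.featureValue(for: "classLabel")?.stringValue
}

Is wrapping the input like this the right approach, or is there a way to get the typed prediction(image:) interface back for a model compiled on device?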