CoreML converted model won't output array

I am new to CoreML and have a simple problem, but I have looked everywhere and cannot seem to find the answer. The simplest version of my problem is this: I have a model in PyTorch that should output an array (I don't want a softmax; I want to do multilabel classification and output a separate probability for each class). I want to convert this model to an MLModel using coremltools and use it in a Swift app.

Here is a dummy model and its conversion to MLModel in Python:

import coremltools as ct
import torch
from torchinfo import summary

# From pytorch: https://pytorch.org/tutorials/beginner/introyt/modelsyt_tutorial.html 
class TinyModel(torch.nn.Module):

    def __init__(self):
        super(TinyModel, self).__init__()

        self.linear1 = torch.nn.Linear(100, 200)
        self.activation = torch.nn.ReLU()
        self.linear2 = torch.nn.Linear(200, 100)

    def forward(self, x):
        x = self.linear1(x)
        x = self.activation(x)
        x = self.linear2(x)
        return x

def main():
    guitar_net = TinyModel()
    guitar_net.eval()

    # Trace the model with random data.
    example_input = torch.rand(100)

    summary(guitar_net, (100,))

    traced_model = torch.jit.trace(guitar_net, example_input)
    out = traced_model(example_input)

    model = ct.convert(
        model=traced_model,
        inputs=[ct.TensorType(shape=example_input.shape)],
    )

    model.save("TorchModel.mlmodel")


if __name__ == '__main__':
    main()

The above model should (and, when run in Python, does) output a tensor of length 100. But when I open the converted model in Xcode, Xcode says the output is just a single Float32:

[Screenshot: Xcode's preview of the converted model, showing the output as a single Float32]
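
For reference, the converted model's declared inputs and outputs can also be inspected from Python with coremltools' spec API. A minimal diagnostic sketch, assuming the TorchModel.mlmodel file saved above:

import coremltools as ct

mlmodel = ct.models.MLModel("TorchModel.mlmodel")
spec = mlmodel.get_spec()

# Print the declared input and output feature descriptions.
# I would expect the output to be a MultiArray annotated with shape [100],
# but I want to confirm what the converter actually wrote into the spec.
for inp in spec.description.input:
    print("input:", inp)
for out in spec.description.output:
    print("output:", out)

I would expect the output feature to be a MultiArray with shape [100], so I am not sure whether the shape is simply missing from the spec or whether the conversion itself went wrong.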

In spite of this, I can run predictions on the model and even access the output as if it were an array:

import SwiftUI
import CoreML
import UIKit

struct ContentView: View {
        
    var torchModel: TorchModel?
    
    init() {
        do {
            torchModel = try TorchModel(configuration: MLModelConfiguration())
        } catch {
            torchModel = nil
            print(error)
        }
    }

    var body: some View {
        Text("Hello, Stack Overflow!")
        .task {
            let buffer = [Float32](repeating: 0, count: 100)
            let shape: [NSNumber] = [100]
            
            guard let mlMultiArray = try? MLMultiArray(shape: shape, dataType: .float32) else {
                fatalError("Unexpected runtime error. MLMultiArray")
            }
            for (index, element) in buffer.enumerated() {
                mlMultiArray[index] = NSNumber(value: element)
            }
            let input = TorchModelInput(x: mlMultiArray)
            
            guard let predictionOutput = try? torchModel?.prediction(input: input).var_10 else {
                fatalError("Unexpected runtime error. model.prediction")
            }
            // Read the output back as if it were a length-100 array.
            print(predictionOutput[99])
        }
    }
}

The issue is that I occasionally get a bad-access crash when running this code (EXC_BAD_ACCESS (code=1, address=0x13fcffff0)). With Address Sanitizer turned on, it crashes every time, presumably because the converted model declares a single Float32 output while I am treating the MLMultiArray as a length-100 array. How can I convert the model properly so that the declared output is an array of length 100?
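
For reference, the converted model can also be exercised directly from Python with coremltools' predict API (macOS only). A minimal sketch, assuming the input/output names x and var_10 that Xcode generated for my model:

import numpy as np
import coremltools as ct

mlmodel = ct.models.MLModel("TorchModel.mlmodel")

# Feed a dummy length-100 vector and look at what comes back.
# "x" and "var_10" are the input/output names Xcode shows for this model.
x = np.random.rand(100).astype(np.float32)
result = mlmodel.predict({"x": x})
out = np.asarray(result["var_10"])
print(out.dtype, out.shape)  # I expect (100,) here if the converted model is fine

If this prints (100,), the underlying network seems fine and only the declared output type/shape would be wrong, but I am not sure how to get the converter to declare it correctly.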
