I am using the following to convert images to grayscale before showing them on a UITableView using UIImageView:
extension UIImage {
    var noir: UIImage? {
        let contextForGrayscale = CIContext(options: nil)
        guard let currentFilter = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
        currentFilter.setValue(CIImage(image: self), forKey: kCIInputImageKey)
        if let output = currentFilter.outputImage,
           let cgImage = contextForGrayscale.createCGImage(output, from: output.extent) {
            return UIImage(cgImage: cgImage, scale: scale, orientation: imageOrientation)
        }
        return nil
    }
}
Since I am showing these images in a UITableView using UIImageView, each image is grayscaled as the user scrolls. On my iPhone 13, performance seems very good and I don't see any lag. However, I am curious how well it performs on an older device. I don't have one, so I am unable to test.
Is this a performant way to grayscale on the fly and display them? Is there anything I can do to make it better?
Is there a way to make my phone slower for performance testing? Something like simulating an older device?
If performance / memory pressure doesn't seem to be an issue, I'd just not worry about it. If it does become a problem, you could use NSCache. I'd do the caching outside the extension, but for the sake of a code example:
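Something along these lines (a sketch, not a drop-in implementation: it keys the cache on the source image instance itself, which only helps if the same `UIImage` objects are reused as the table scrolls; `noirCache` is a name I've made up here):

```swift
import UIKit

// NSCache keys must be objects, so the source image itself is used as
// the key. NSCache evicts entries automatically under memory pressure.
private let noirCache = NSCache<UIImage, UIImage>()

extension UIImage {
    var noir: UIImage? {
        // Return the cached conversion if the work was already done.
        if let cached = noirCache.object(forKey: self) {
            return cached
        }
        let context = CIContext(options: nil)
        guard let currentFilter = CIFilter(name: "CIPhotoEffectNoir"),
              let input = CIImage(image: self) else { return nil }
        currentFilter.setValue(input, forKey: kCIInputImageKey)
        guard let output = currentFilter.outputImage,
              let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
        let result = UIImage(cgImage: cgImage, scale: scale, orientation: imageOrientation)
        noirCache.setObject(result, forKey: self)
        return result
    }
}
```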
Also, check out this article: https://nshipster.com/image-resizing/. In addition to creating this new image, you could also create a thumbnail sized for its display image view and use the built-in caching mechanisms for it. This would save some memory and improve performance overall. But again, if it's not an issue, I'd be happier to just have the simpler code and no caching!
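As a sketch of the resizing idea: downscale to the image view's size before filtering, so the noir pass and the cell only ever handle the pixels actually shown (`targetSize` would be your image view's bounds, which is an assumption on my part):

```swift
import UIKit

extension UIImage {
    // Redraw the image at a smaller size using UIGraphicsImageRenderer.
    func resized(to targetSize: CGSize) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}

// Usage: grayscale the small image, not the full-resolution original.
// let thumbnail = photo.resized(to: imageView.bounds.size).noir
```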
Oh, one more thing. You could use UITableViewDataSourcePrefetching (https://developer.apple.com/documentation/uikit/uitableviewdatasourceprefetching) to create the image asynchronously ahead of time, before the cell is displayed, so it's ready to go by the time the table asks for the cell at the given index path. Thinking about it, this is probably the simplest / nicest solution here.
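A minimal sketch of the prefetching approach, assuming a `photos: [UIImage]` data source and a `noirImages` dictionary as the cache (both hypothetical names of mine):

```swift
import UIKit

class PhotosViewController: UIViewController, UITableViewDataSourcePrefetching {
    var photos: [UIImage] = []
    var noirImages: [Int: UIImage] = [:]

    func tableView(_ tableView: UITableView, prefetchRowsAt indexPaths: [IndexPath]) {
        for indexPath in indexPaths where noirImages[indexPath.row] == nil {
            let photo = photos[indexPath.row]
            // Do the filter work off the main thread...
            DispatchQueue.global(qos: .userInitiated).async {
                let noir = photo.noir
                // ...and store the result back on the main thread.
                DispatchQueue.main.async {
                    self.noirImages[indexPath.row] = noir
                }
            }
        }
    }
}

// Remember to set tableView.prefetchDataSource = self, and in
// cellForRowAt fall back to computing .noir directly if the
// prefetch hasn't finished yet.
```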