I have the following code to get the pixel data from a UIImage. It works for most images, but not when I create the image with UIGraphicsImageRenderer. I was hoping someone knew a solution to this.
My current code generates a simple image, but then accessing its data gives unexpected results.
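(PixelData is assumed here to be the standard 4-byte ARGB struct from Apple’s sample code, along these lines:)

    struct PixelData {
        var a: UInt8
        var r: UInt8
        var g: UInt8
        var b: UInt8
    }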
func myDraw() {
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200))
    let image = renderer.image { context in
        context.cgContext.setFillColor(UIColor.black.cgColor)
        context.cgContext.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        context.cgContext.fillPath()
        context.cgContext.setFillColor(UIColor.red.cgColor)
        context.cgContext.addRect(CGRect(x: 100, y: 100, width: 100, height: 100))
        context.cgContext.fillPath()
    }

    let providerData = image.cgImage!.dataProvider!.data
    let data = CFDataGetBytePtr(providerData)!

    var pixels = [PixelData]()
    for i in stride(from: 0, to: 160000-1, by: 4) {
        pixels.append(PixelData(a: data[i+3], r: data[i+0], g: data[i+1], b: data[i+2]))
    }

    self.canvas.image = self.imageFromARGB32Bitmap(pixels: pixels, width: 200, height: 200)
}
I used the following code to rebuild an image from the pixel data, to check whether everything was working correctly.
func imageFromARGB32Bitmap(pixels: [PixelData], width: Int, height: Int) -> UIImage? {
    guard width > 0 && height > 0 else { return nil }
    guard pixels.count == width * height else { return nil }

    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue)
    let bitsPerComponent = 8
    let bitsPerPixel = 32

    var data = pixels // Copy to mutable []
    guard let providerRef = CGDataProvider(data: NSData(bytes: &data,
                                                        length: data.count * MemoryLayout<PixelData>.size))
    else { return nil }

    guard let cgim = CGImage(
        width: width,
        height: height,
        bitsPerComponent: bitsPerComponent,
        bitsPerPixel: bitsPerPixel,
        bytesPerRow: width * MemoryLayout<PixelData>.size,
        space: rgbColorSpace,
        bitmapInfo: bitmapInfo,
        provider: providerRef,
        decode: nil,
        shouldInterpolate: true,
        intent: .defaultIntent
    )
    else { return nil }

    return UIImage(cgImage: cgim)
}
A few observations:
1. Your code assumes that UIGraphicsImageRenderer generates images with a scale of 1, whereas its scale defaults to that of your device (e.g. 2 or 3 on Retina screens), so the bitmap is larger than you expect. Instead, force the scale to 1.
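For example (a sketch, using the question’s 200 × 200 size):

    let format = UIGraphicsImageRendererFormat()
    format.scale = 1  // 1 point = 1 pixel, so the bitmap is exactly 200 × 200 × 4 bytes
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200), format: format)

With the default format on a 3× device, a 200-point image is 600 × 600 pixels (1,440,000 bytes), so a loop over the first 160,000 bytes only ever reads the top portion of the bitmap.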
2. It’s not the issue here, but we must note that your code also assumes the image generated by UIGraphicsImageRenderer will have a particular byte order and pixel format, as does your imageFromARGB32Bitmap. If you look at Apple’s Technical Q&A QA1509 (from which your code was undoubtedly originally adapted), they don’t just assume that the buffer will be in a particular format. When we want to manipulate/examine a buffer, we should (a) create a context of the desired format, (b) draw our image (or whatever) into that context, and only then can we reliably look at the pixel data.
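That pattern might look something like this (a sketch; argbBytes is my name, not an API):

    import UIKit

    // Render the image into a context whose format we dictate (8 bits per
    // component, premultiplied-first alpha, device RGB, i.e. packed ARGB),
    // then copy the resulting bytes out.
    func argbBytes(for image: UIImage) -> [UInt8]? {
        guard let cgImage = image.cgImage else { return nil }

        let width = cgImage.width
        let height = cgImage.height
        let bytesPerRow = width * 4

        guard let context = CGContext(data: nil,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
        else { return nil }

        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

        guard let data = context.data else { return nil }
        let pointer = data.bindMemory(to: UInt8.self, capacity: bytesPerRow * height)
        return Array(UnsafeBufferPointer(start: pointer, count: bytesPerRow * height))
    }

Because we dictate the context’s format, this works no matter how the source image happens to be stored internally.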
3. The imageFromARGB32Bitmap works, but it makes me a bit nervous:

- The use of MemoryLayout<PixelData>.size: Apple advises using a type’s stride, not its size, when calculating the memory occupied by a sequence of instances (which is exactly what a pixel buffer is). So, I’d use stride.

- What if stride wasn’t 4 like you expect it to be? I can’t imagine it would ever not be 4, but the data provider assumes that the pixels will be packed in. It’s a minor observation, but I might make this assumption explicit.

- Are we 100% assured that dereferencing &data will give us a contiguous buffer? I’d lean towards withContiguousStorageIfAvailable just to be safe.

For example: