GPUImageFilter vs CIFilter (memory issue with GPUImage)


Is there a memory issue with GPUImage? Below are two snippets that apply a vignette filter effect. The first one (Apple's Core Image filter) uses about 19 MB of memory, whereas the GPUImage version uses more than 75 MB. What's wrong with my code?

Vignette Filter with CIFilter

// Wrap the UIImage in a CIImage and set it as the vignette filter's input
CIImage *ciImage = [[CIImage alloc] initWithImage:image];
CIFilter *filter = [CIFilter filterWithName:@"CIVignetteEffect" keysAndValues:kCIInputImageKey, ciImage, nil];

// Render the filter's output into a CGImage
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];

UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

return result;

And here is the GPUImage version:

// Convenience method: filter the UIImage in a single call
GPUImageFilter *f = [[GPUImageVignetteFilter alloc] init];
UIImage *result = [f imageByFilteringImage:image];

return result;

I'm using "ARC" in my project. Do you have any idea? What should I do to be able to "release" GPUImage filter memory?

1 Answer

For the GPUImage approach in the above code, there will briefly be four copies of an image in memory: the original UIImage and its bytes, the texture that this image is uploaded to on the GPU, the result of the filtering operation, and the UIImage and its bytes that are created as a result of that operation.

The convenience methods used above might be simple, but they're not the best at reducing memory pressure. Instead, I'd try something like the following:

// Upload the source image to a GPU texture and attach the filter
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image smoothlyScaleOutput:NO];
GPUImageVignetteFilter *vignetteFilter = [[GPUImageVignetteFilter alloc] init];
[stillImageSource addTarget:vignetteFilter];

// Have the filter share its output bytes with the UIImage captured below
[vignetteFilter prepareForImageCapture];
[stillImageSource processImage];
UIImage *result = [vignetteFilter imageFromCurrentlyProcessedOutput];

Depending on how your original UIImage was created, you might want to wrap the first line there in an @autoreleasepool, along with the creation of the original UIImage (see the sketch below). The original UIImage is no longer needed after that first line, so at most two images should be in memory at any given time.
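As a rough sketch of that idea (the file-based imagePath source is just a hypothetical example; any way of creating the UIImage works), the pool lets the original image's bytes be reclaimed as soon as they've been uploaded to the GPU:

GPUImagePicture *stillImageSource;
@autoreleasepool {
    // Hypothetical source; substitute however you actually create the image
    UIImage *inputImage = [UIImage imageWithContentsOfFile:imagePath];

    // After this init, the pixels live in a GPU texture, so the UIImage
    // can be released when the pool drains
    stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage
                                          smoothlyScaleOutput:NO];
}
// From here on, only the texture (and later the filtered output) is alive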

The -prepareForImageCapture call is an optimization that causes the returned UIImage to share a memory mapping with the filter's output texture. This removes the need for one of the image copies at the end, but it ties the filter to that UIImage and can cause odd behavior if you try to reuse either one. I'm also not entirely sure that all memory is freed when a filter is deallocated before its memory-mapped UIImage is, so you might be safer encoding that image as a JPEG or PNG at the end of the method and passing back only the NSData representation.
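If you go that route, a minimal sketch (the 0.9 JPEG quality is just an example value) is to encode immediately after capture, so the filter and its memory-mapped UIImage can both be deallocated as soon as the method returns:

UIImage *result = [vignetteFilter imageFromCurrentlyProcessedOutput];
// Copy the pixels into a compressed buffer; nothing keeps the
// filter or its memory-mapped UIImage alive after this
NSData *jpegData = UIImageJPEGRepresentation(result, 0.9);
return jpegData;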