I have been working on a photo editor app for iOS using the Core Image (CIFilter) and GPUImage frameworks, and applying filters to high-resolution images takes a lot of time.
To decrease the processing time, I apply the filtering and editing to a downscaled copy of the original image. As expected, this produces a low-resolution image as output.
Now I am struggling to produce a high-resolution image in the output. It would be a great help if anyone could suggest ideas or possible solutions, either to decrease the processing time or to upscale the result back to the original resolution.
In our apps, we use different resolutions for editing and exporting: for editing, rendering needs to be fast and snappy, but for export, depending on the user-chosen export resolution, processing may take some time.
We reduce the export time on older devices by processing at a smaller resolution internally (still much higher than the preview resolution) and upsampling the image afterward.
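With Core Image, that internal downscale can be done with a simple affine transform before the filter chain runs. Here is a minimal sketch; the helper name and the maxDimension parameter are just placeholders, not part of any API:

```swift
import CoreImage

// Hypothetical helper: downscale a CIImage so its longest side is at most
// maxDimension, before running the expensive filter chain on it.
func downscaled(_ image: CIImage, maxDimension: CGFloat) -> CIImage {
    let scale = maxDimension / max(image.extent.width, image.extent.height)
    guard scale < 1 else { return image } // already small enough, don't upscale
    return image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
}
```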
For upsampling, you can use joint bilateral upsampling, a technique that uses the original high-resolution image as a guide to scale up the smaller, filtered image with very high quality. Apple implemented this technique in CIEdgePreserveUpsampleFilter.
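A minimal sketch of wiring that up (the function and variable names are placeholders; the parameter keys are the ones Core Image documents for this filter):

```swift
import CoreImage

// `original` is the full-resolution source image (the guide),
// `filteredSmall` is the filter chain's output at the reduced size.
func upsample(_ filteredSmall: CIImage, guidedBy original: CIImage) -> CIImage? {
    let filter = CIFilter(name: "CIEdgePreserveUpsampleFilter")!
    filter.setValue(original, forKey: "inputImage")           // full-res guide image
    filter.setValue(filteredSmall, forKey: "inputSmallImage") // small, filtered image
    return filter.outputImage
}
```

Because the full-resolution image acts as the guide, the upsampled result comes out at the original resolution with edges that stay aligned with the original, which is what makes this look much better than a plain bilinear or Lanczos upscale of the filtered image.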