Reconstruction of original image using the Laplacian Filter output


I have applied a Laplacian filter to an image to detect its edges.

import matplotlib.pyplot as plt
from skimage import data, filters

image = data.camera()  # placeholder: any grayscale input image
output = filters.laplace(image)
plt.imshow(output, cmap='gray')
plt.title('Laplace', size=20)
plt.show()

I need to reconstruct the original image using the output obtained from the code above.

I'm not sure if 'filters.inverse' works or if there is any other method available.

1 Answer

What you are looking for is called deconvolution. If you search for "scikit-image deconvolution", you will probably land at the documentation for the Richardson-Lucy deconvolution function, or at an example of its usage. Note: it is not always theoretically possible to reconstruct the original signal (it's a little bit like unmixing paint), but you can get reasonable approximations, especially if the convolution kernel is exactly known.

You can look at the source code for the Laplace filter, where you see that the image is convolved with a Laplacian kernel. That is the kernel we need in order to deconvolve the image. (Note that you can always regenerate the kernel by applying the filter to an image containing just a 1 in the center and 0s everywhere else, as sketched below. That's why the kernel in deconvolution is referred to as the point-spread function.)
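
As a quick illustration (a minimal sketch, not taken from the filter's documentation), you can recover that point-spread function by running filters.laplace on an impulse image, i.e. a single 1 surrounded by 0s:

import numpy as np
from skimage import filters

# Impulse image: a single 1 in the center, 0s everywhere else.
impulse = np.zeros((7, 7))
impulse[3, 3] = 1.0

# Filtering the impulse returns the filter's point-spread function (PSF);
# the 3x3 neighbourhood around the center is the Laplacian kernel itself.
psf = filters.laplace(impulse)
print(psf[2:5, 2:5])

The printed 3x3 block is exactly the kernel that the deconvolution step below needs.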

So, to restore your image:

from skimage.restoration.uft import laplacian
from skimage.restoration import richardson_lucy

kernel_size = 3  # default ksize of filters.laplace; increase if you changed it
# laplacian() returns (transfer_function, impulse_response); the impulse
# response is the spatial kernel we need as the PSF for deconvolution.
_, kernel = laplacian(output.ndim, (kernel_size,) * output.ndim)
restored = richardson_lucy(output, kernel)
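
As a sanity check (assuming the original image is still in memory; this comparison is just a sketch on top of the code above), you can view the three stages side by side:

import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
axes[0].imshow(image, cmap='gray')
axes[0].set_title('Original')
axes[1].imshow(output, cmap='gray')
axes[1].set_title('Laplace')
axes[2].imshow(restored, cmap='gray')
axes[2].set_title('Restored (Richardson-Lucy)')
for ax in axes:
    ax.axis('off')
plt.show()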