Contrast enhancement: how to linearly stretch the grey levels of an image?


[Screenshot of the image values] [This is the original] [This is the expected output] [This is the output I get]

I'm trying to stretch the grey levels from 0-100 to 50-200 in Python, but the output image is not right. I drew the straight line representing the linear relationship between the two ranges, and in line 8 I'm using that equation to get the output. What's wrong with my code?

This is my first question, so sorry for mistakes.
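For reference, the linear map I derived from that line (old range 0-100 to new range 50-200) is new = (3/2) * old + 50, which sends 0 to 50 and 100 to 200; line 8 below implements this, with the division by 255 because my image values are normalized to 0-1.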

def Contrast_enhancement(img):
    newimg = img
    height = img.shape[0]
    width = img.shape[1]
    for i in range(height):
        for j in range(width):
            if(img[i][j] * 255 >= 0 and img[i][j] * 255 <= 100):
                newimg[i][j] = (((3/2) * (img[i][j] * 255)) + 50)/255
    return newimg

2 Answers

import numpy as np
import copy

def Contrast_enhancement(img):
    newimg = np.array(copy.deepcopy(img))  # make a real copy of img; otherwise any change to img would change newimg too
    temp_img = np.array(copy.deepcopy(img)) * 3/2 + 50/255  # the stretched values
    # a grey level <= 100 on the 0-255 scale means <= 100/255 for a normalized image
    newimg = np.where(newimg <= 100/255, temp_img, newimg)
    return newimg

or shorter:

import numpy as np
import copy

def Contrast_enhancement(img):
    newimg = np.array(copy.deepcopy(img))  # make a real copy of img; otherwise any change to img would change newimg too
    newimg = np.where(newimg <= 100/255, newimg * 3/2 + 50/255, newimg)
    return newimg

The copy part should solve your problem, and the numpy part is just there to speed things up. np.where returns temp_img wherever newimg is <= 100/255 (i.e. a grey level of at most 100 on the 0-255 scale) and newimg otherwise.
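A quick usage sketch with the function above, assuming (as in the question) a normalized float image with values in 0-1; the sample array below is made up for illustration:

import numpy as np

# hypothetical 2x3 grey-level image, normalized to 0-1
img = np.array([[0.10, 0.30, 0.80],
                [0.00, 0.39, 0.50]])

out = Contrast_enhancement(img)

# pixels whose grey level (on the 0-255 scale) is at most 100 are remapped by
# level -> 1.5 * level + 50, i.e. 0-100 stretches to 50-200; brighter pixels
# (0.80 and 0.50 here) are left unchanged
print(np.round(out * 255, 1))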


There are two answers to your question:

  • One is strictly technical (the one @DonQuiKong tries to give) and concerns how to do the stretching you describe more simply or correctly.
  • The other is implicit and addresses your actual problem of image stretching.
I am focusing on the second case here. Judging from the image samples you provided, you are not taking the correct approach. Let's assume the samples you provided do have all their intensity values between 0 and 100 (from a screen capture on my PC they don't, but that is screen-dependent to a degree). Your method seems correct and should work apart from a few minor bugs.

1) A minor bug, for example, is that:

newimg = img

does not do what you think it does. It creates an alias of the original variable, so every change to newimg also changes img. Use:

newimg = img.copy()

instead.
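A quick toy demonstration of the difference (the array values are made up):

import numpy as np

a = np.zeros((2, 2))
alias = a             # no data is copied; both names refer to the same array
real_copy = a.copy()  # an independent copy of the data

alias[0, 0] = 1.0
print(a[0, 0])          # 1.0 -- changing the alias changed the original
print(real_copy[0, 0])  # 0.0 -- the real copy is unaffected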

2) If an image with different intensity boundaries comes along, your code breaks: it silently ignores every pixel outside the 0-100 range, which is probably not what you want.

3) In that case, the stretching you want can be applied to the whole image using something like:

newimg -= np.min(newimg)
newimg /= np.max(newimg)

which just stretches your intensities over the full available range (0-1 for a normalized float image, which corresponds to 0-255 on display).
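Wrapped into a self-contained helper (a sketch; the function name is mine, and it assumes a non-constant float image):

import numpy as np

def full_range_stretch(img):
    # linearly map the image's own minimum/maximum onto the full 0-1 range
    out = img.astype(float) - np.min(img)
    return out / np.max(out)  # assumes the image is not constant (max > 0)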

4) Judging from your sample images, you also need a more radical stretch (one that sacrifices a bit of image information to increase contrast). Instead of the above, you can divide by something lower than the maximum:

newimg -= np.min(newimg)
newimg /= (np.max(newimg) * 0.5)

This effectively "burns" some pixels, but in your case the result looks closer to the one you want. Alternatively, you can apply a non-linear mapping (a logarithmic one, for example) from old intensities to new ones and you won't get any "burned" pixels; a sketch of such a mapping is given at the end of this answer.

A sample output using the 0.5 divisor: [image]
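For reference, a minimal sketch of such a logarithmic mapping, assuming a normalized float image in 0-1; the function name and the 255 scaling constant are illustrative, not from the original post:

import numpy as np

def log_stretch(img):
    # shift so the darkest pixel becomes 0, compress with log(1 + x), renormalize to 0-1
    shifted = img.astype(float) - np.min(img)
    stretched = np.log1p(shifted * 255.0)  # log1p avoids log(0) at the darkest pixel
    return stretched / np.max(stretched)   # assumes the image is not constant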