What sort of fuzzy flood fill algorithms are there?


I'm implementing a flood fill algorithm for the project I'm currently working on. I'm using it for the normal purpose, image editing. I have no problem with the basic algorithm, but I'd like a better looking fill.

In many cases areas of my image will have regions that are mostly one color, but which are bordered by pixels which are slightly lighter or darker. I'd like to know an algorithm for a "fuzzy" flood fill that won't leave these border pixels. I've tried filling all pixels whose color falls within a given distance of the origin pixel's color, using two different simple distance metrics:

  1. the Manhattan distance over all three color components: red, green, and blue
  2. the maximum of the distances between corresponding color components.
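For concreteness, the two metrics above can be sketched like this (pixel colors assumed to be `(r, g, b)` tuples; the function names are mine):

```python
def manhattan_distance(c1, c2):
    """Sum of absolute differences across the three channels."""
    return sum(abs(a - b) for a, b in zip(c1, c2))

def chebyshev_distance(c1, c2):
    """Maximum absolute difference across the three channels."""
    return max(abs(a - b) for a, b in zip(c1, c2))
```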

Neither of these does the trick, often leaving borders and occasionally filling adjacent regions of a visually distinct but "close" color.

I don't think there is a magic bullet to solve my problem, but I'd be interested in knowing any algorithms which I might try to get a better result, or even where I might usefully look to find such algorithms. Looking around the net I've found reference to something called the "fuzzy flood fill mean shift algorithm", but I'm not sure that's even the same thing.


There are 3 best solutions below


Using the actual Euclidean distance between the two colors seems natural: D = Sqrt(dR^2 + dG^2 + dB^2), where dR, dG, and dB are the differences in the red, green, and blue components.

Then define a tolerance parameter that specifies the maximum distance from the original pixel (in colorspace) that the test pixel can be. If it is greater than that value, do not flood outward from that pixel.

Tweak the tolerance from 0 to Sqrt(255^2 + 255^2 + 255^2) until you see the desired effect.
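A minimal sketch of this answer's approach, assuming the image is a mutable grid of `(r, g, b)` tuples and using a 4-connected queue-based fill (function and parameter names are mine):

```python
import math
from collections import deque

def flood_fill_euclidean(img, start, new_color, tolerance):
    """Fill from `start` (a (row, col) pair), visiting 4-connected pixels
    whose Euclidean distance in RGB space from the *origin* color is
    within `tolerance`. `img` is a list of lists of (r, g, b) tuples."""
    h, w = len(img), len(img[0])
    sy, sx = start
    origin = img[sy][sx]
    queue = deque([(sy, sx)])
    seen = {(sy, sx)}
    while queue:
        y, x = queue.popleft()
        img[y][x] = new_color
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen
                    and math.dist(img[ny][nx], origin) <= tolerance):
                seen.add((ny, nx))
                queue.append((ny, nx))
```

With `tolerance = 0` this degenerates to an exact-match fill; at `Sqrt(255^2 * 3)` (about 441.7) it fills everything reachable.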


Maybe you could try using the qualities of the local pixels rather than the origin pixel. You could get an effect much like an anisotropic diffusion filter: enqueue a neighbor only if the gradient between the current pixel (already in the fill) and the neighbor pixel is low enough.
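One possible sketch of this local-gradient idea (names and the choice of per-channel maximum as the gradient measure are assumptions, not from the answer):

```python
from collections import deque

def flood_fill_local(img, start, new_color, max_step):
    """Variant fill: a neighbor is enqueued when the per-channel difference
    between it and the *current* pixel (not the origin) is at most
    `max_step`, so the fill can follow gentle shading across a region."""
    h, w = len(img), len(img[0])
    queue = deque([start])
    seen = {start}
    colors = {start: img[start[0]][start[1]]}  # remember pre-fill colors
    while queue:
        y, x = queue.popleft()
        here = colors[(y, x)]
        img[y][x] = new_color
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                there = img[ny][nx]
                if max(abs(a - b) for a, b in zip(here, there)) <= max_step:
                    seen.add((ny, nx))
                    colors[(ny, nx)] = there
                    queue.append((ny, nx))
```

One caveat worth noting: because each step is judged only against its immediate neighbor, the fill can "walk" across a long, slow gradient into a region far from the origin color, so this is often combined with a global cap on distance from the origin.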


You should set the tolerance not as a single number but as a range. Say, from 20% to 50%: when the color difference is 20% or less, you change the color of the pixel completely; when it's greater than 50%, you don't fill the pixel at all; and when the difference lies in the range from 20% to 50%, you blend the old color with the new color with a ratio of (d - t_min) / (t_max - t_min), where d is the color difference and t_max and t_min are your tolerance range (expressed in 0...1). I've never seen such an algorithm implemented; maybe I just invented it.
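A sketch of this soft-edged fill, under some assumptions the answer leaves open: distances are measured as Euclidean RGB distance from the origin pixel (so t_min/t_max are in those units rather than percentages), and only pixels within t_max are traversed:

```python
import math
from collections import deque

def flood_fill_blend(img, start, new_color, t_min, t_max):
    """Fill with soft edges: pixels within t_min of the origin color take
    new_color fully; pixels beyond t_max are left alone; in between, the
    old and new colors are blended linearly."""
    h, w = len(img), len(img[0])
    sy, sx = start
    origin = img[sy][sx]
    queue = deque([(sy, sx)])
    seen = {(sy, sx)}
    while queue:
        y, x = queue.popleft()
        old = img[y][x]
        d = math.dist(old, origin)
        if d <= t_min:
            img[y][x] = new_color
        else:
            # The old color's weight grows from 0 at t_min to 1 at t_max.
            ratio = (d - t_min) / (t_max - t_min)
            img[y][x] = tuple(round(o * ratio + n * (1 - ratio))
                              for o, n in zip(old, new_color))
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen
                    and math.dist(img[ny][nx], origin) <= t_max):
                seen.add((ny, nx))
                queue.append((ny, nx))
```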