I need to tap on a UIImageView
...
... and compare the color at the tap location with a color returned from an asset catalog (or a custom color created in code).
I am having lots of trouble with color spaces and my matches always fail. I have read some great examples on Stack Overflow and have tried them, but I must still be doing something wrong.
I have also tried to use custom colors like this (instead of asset colors):
+ (UIColor *)themeRed {
    return [UIColor colorWithRed:192.0f/255.0f green:92.0f/255.0f blue:42.0f/255.0f alpha:1];
}
Still no matches. My code for testing matches follows:
- (void)tappedColorView:(UITapGestureRecognizer *)tapRecognizer {
    CGPoint touchPoint = [tapRecognizer locationInView:uiiv_hs];

    UIColor *color = [uiiv_hs colorOfPoint:touchPoint];
    NSLog(@"color %@", color);

    UIColor *matchcolor = [UIColor themeRed];
    NSLog(@"mcolor %@", matchcolor);

    NSArray *colors = [NSArray arrayWithObjects:[UIColor colorNamed:@"Color01"], [UIColor colorNamed:@"Color02"], nil];

    if ([color matchesColor:matchcolor error:nil]) {
        NSLog(@"1Match!");
    } else {
        NSLog(@"1No Match!");
    }

    if ([color isEqualToColor:[UIColor themeRed]]) {
        NSLog(@"2Match!");
    } else {
        NSLog(@"2No Match!");
    }
}
Don't do this if you're not familiar with the following topics. I'm going to show you one way, a very simplified one, but there will be gotchas.
Resources
Something to read upfront, or at least to be aware of:
Problem #1 - Which color do you want?
Your UIImageView can be fully opaque, transparent, partially opaque, ... Let's say that there's a view with a yellow color below the UIImageView, the UIImageView isn't opaque, and its alpha is set to 50%. Which color do you want? The original image color? The rendered color (mixed with the yellow one)?

I assume the one mixed with the yellow. Here's the code to get the right color.
Objective-C:
Swift:
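A minimal Swift sketch of such a colorOfPoint helper. It walks up to the first fully opaque ancestor (the targetOpaqueView dance mentioned below) and renders a single pixel at the converted point; treat it as one possible implementation, not the only one:

```swift
import UIKit

extension UIView {
    // Walks up the hierarchy until the view is fully opaque, so the rendered
    // pixel contains whatever shows through a transparent image view.
    // Simplified: only the alpha value is checked here.
    private var targetOpaqueView: UIView {
        var view = self
        while view.alpha < 1.0, let superview = view.superview {
            view = superview
        }
        return view
    }

    // Renders a single pixel at `point` and returns its color (alpha ends up
    // as 1.0 as long as an opaque ancestor was found).
    func colorOfPoint(_ point: CGPoint) -> UIColor {
        let targetView = targetOpaqueView
        let targetPoint = convert(point, to: targetView)

        var pixel: [UInt8] = [0, 0, 0, 0]
        let context = CGContext(data: &pixel,
                                width: 1,
                                height: 1,
                                bitsPerComponent: 8,
                                bytesPerRow: 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!

        // Shift the context so the requested point lands on the 1x1 pixel.
        context.translateBy(x: -targetPoint.x, y: -targetPoint.y)
        targetView.layer.render(in: context)

        return UIColor(red: CGFloat(pixel[0]) / 255.0,
                       green: CGFloat(pixel[1]) / 255.0,
                       blue: CGFloat(pixel[2]) / 255.0,
                       alpha: CGFloat(pixel[3]) / 255.0)
    }
}
```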
This function should return a color with the alpha component set to 1.0. It has to be 1.0, otherwise we can't continue. Why? Imagine that the UIImageView alpha is set to 50% -> we won't traverse the view hierarchy (the targetOpaqueView dance) -> we'll get a color with an alpha component close to 0.5 -> not of any use; what's below? White? Black? Orange?

Problem #2 - Device
Every device is different and can display a different range of colors - iPads, iPhones, ... This also includes other device types like computer displays, printers, ... Because I don't know what you're trying to achieve in your application, take it as a reminder: the same color can look different on every device.
Problem #3 - Color profile
Here's the comparison of the Display P3 profile and Generic sRGB. You can compare more profiles by launching ColorSync Utility on macOS. It demonstrates that a color will differ when you convert it from one space to another.
Problem #4 - Color conversion
Quotes from the Color transformation chapter:
This is a somewhat complicated process and it depends on many things (color spaces, ...). It can involve posterization (16-bit -> 8-bit, for example), it depends on your rendering intent (relative colorimetric, perceptual, ...), etc.
Very simplified example - you have an image with a red color (#FF0000) and the Generic sRGB profile assigned to it. An iPhone is going to display it (P3) and you'll see a different color. More precisely, if you get the RGB component values, they'll differ.

CoreGraphics.framework provides a color conversion function - converted(to:intent:options:). You have to pass a destination color space, a rendering intent, and optional conversion options (can be nil).

Objective-C:
Swift:
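A Swift sketch of the conversion, wrapping CGColor's converted(to:intent:options:). Extended sRGB is picked here only as an example destination space:

```swift
import UIKit

extension UIColor {
    // Converts the receiver to the given color space.
    // Returns nil when CoreGraphics can't perform the conversion.
    func converted(to space: CGColorSpace) -> UIColor? {
        guard let converted = cgColor.converted(to: space,
                                                intent: .defaultIntent,
                                                options: nil) else {
            return nil
        }
        return UIColor(cgColor: converted)
    }
}

// Usage: convert a Display P3 red into extended sRGB component values.
let p3Red = UIColor(displayP3Red: 0.8, green: 0.2, blue: 0.2, alpha: 1.0)
if let space = CGColorSpace(name: CGColorSpace.extendedSRGB),
   let srgbRed = p3Red.converted(to: space) {
    print(srgbRed)
}
```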
An example - the red color (#CC3333) has RGB components 0.8 0.2 0.2 1.

Problem #5 - Custom color
You can create colors to compare against in two ways:

- with a UIColor initializer
- in the Xcode assets catalog

The assets catalog allows you to specify a color for different devices and gamuts, or you can select custom content (Display P3, sRGB, Extended Range sRGB, ...). The UIColor initializers allow you to specify a color in several color spaces (not just sRGB). Be aware of how you create the color you compare against - RGB component values differ in various color spaces.
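To illustrate the two ways side by side (the nominal component values here are arbitrary; CustomRedColor is the asset used later in this answer):

```swift
import UIKit

// Same nominal components, different color spaces - these are NOT the
// same color when rendered or compared.
let srgbRed = UIColor(red: 0.8, green: 0.2, blue: 0.2, alpha: 1.0)         // (extended) sRGB
let p3Red = UIColor(displayP3Red: 0.8, green: 0.2, blue: 0.2, alpha: 1.0)  // Display P3

// A color defined in the assets catalog; its content type (sRGB,
// Display P3, ...) is picked in Xcode, not in code.
let namedRed = UIColor(named: "CustomRedColor")
```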
Problem #6 - Color comparison
Now that you're aware of how it works, you can see that there's some math involved and that a converted color won't match precisely. In other words - even with two colors converted to the same color space, you still can't compare the components (RGB) with a simple equality operator. Precision is lost due to conversion, and even if it weren't - remember What Every Computer Scientist Should Know About Floating-Point Arithmetic?

There's a way to achieve what you want and it's called color difference. In other words - you'd like to calculate the distance between two colors.
Objective-C:
Swift:
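A Swift sketch of the distance calculation, done on the RGB components reported by getRed(_:green:blue:alpha:):

```swift
import UIKit

extension UIColor {
    // Euclidean distance between two colors' RGB components.
    // Returns nil when a color can't be expressed as RGB components.
    func distance(to other: UIColor) -> CGFloat? {
        var r1: CGFloat = 0, g1: CGFloat = 0, b1: CGFloat = 0, a1: CGFloat = 0
        var r2: CGFloat = 0, g2: CGFloat = 0, b2: CGFloat = 0, a2: CGFloat = 0

        guard getRed(&r1, green: &g1, blue: &b1, alpha: &a1),
              other.getRed(&r2, green: &g2, blue: &b2, alpha: &a2) else {
            return nil
        }
        return sqrt(pow(r1 - r2, 2) + pow(g1 - g2, 2) + pow(b1 - b2, 2))
    }
}
```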
Now that we're able to calculate the Euclidean distance of two colors, let's write a simple function to check whether two colors match with some tolerance:
Objective-C:
Swift:
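A sketch in Swift, assuming a distance(to:) helper on UIColor like the Euclidean one just described; the default tolerance is arbitrary:

```swift
import UIKit

extension UIColor {
    // True when the Euclidean distance to `other` is within `tolerance`.
    // 0.01 is an arbitrary default - tune it to your application's needs.
    func matches(_ other: UIColor, tolerance: CGFloat = 0.01) -> Bool {
        guard let distance = distance(to: other) else {
            return false
        }
        return distance <= tolerance
    }
}
```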
All the pieces together
To put it all together, we're going to:

- pick a red color (#C02A2B) from the image
- add CustomRedColor (#C02A2B) into the assets catalog
- compare the tapped color against the named color
Objective-C:
Swift:
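Hypothetical glue code in Swift, assuming the colorOfPoint(_:) and tolerance-based matches helpers described earlier, an imageView outlet, and the CustomRedColor asset:

```swift
import UIKit

// Assumes: UIView.colorOfPoint(_:), UIColor.matches(_:tolerance:),
// an `imageView` outlet, and a "CustomRedColor" asset - all described above.
@objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
    let point = recognizer.location(in: imageView)
    let tappedColor = imageView.colorOfPoint(point)

    guard let customRed = UIColor(named: "CustomRedColor") else { return }
    print("Tapped color matches CustomRedColor: \(tappedColor.matches(customRed))")
}
```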
Tapped color matches CustomRedColor: true
Conclusion
It's not that easy to get colors right (define, get, compare, convert, ...). I tried to point out the important things while keeping it simple, but you should be able to do it now. Don't forget to correctly create a color, convert it to a proper color space, pick a tolerance which suits your application's needs, etc.
Here's the public GitHub Gist which contains both Objective-C & Swift view controller implementations.
Addendum
The color space conversion is not necessary, because the [UIColor getRed:green:blue:alpha:] documentation says that the color component values are converted for you when you use this method. But I kept the conversion code in this answer to demonstrate what's going on under the hood.