Either the type of first_expression and second_expression must be the same, or an implicit conversion must exist from one type to the other.
If the literal has no suffix, it has the first of these types in which its value can be represented: int, uint, long, ulong.
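To spell that rule out with a few unsuffixed literals (the values here are ones I picked to hit each case):
var a = 2147483647;           // fits in int              -> int
var b = 2147483648;           // too big for int          -> uint
var c = 4294967296;           // too big for uint         -> long
var d = 9223372036854775808;  // too big for long         -> ulong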
Consider:
var value = test ? (Int64)1 : 0;
0, a decimal-digit literal with no suffix, will be given the type int. int can be implicitly converted to Int64. Since this conversion only exists in one direction, we can feel safe that the resulting value will be an Int64.
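A quick sanity check of my own (not essential to the question) shows that even when the 0 branch is the one taken at run time, the result still comes out as an Int64:
bool test = false;                     // force the 0 branch to be the one taken
var value = test ? (Int64)1 : 0;
Console.WriteLine(value.GetType());    // prints System.Int64, so the 0 was converted to Int64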
However:
var value = test ? (UInt64)1 : 0;
UInt64 and int cannot be implicitly converted to each other, yet this code compiles and runs, and the resulting type is UInt64.
At what point is the type of 0 determined?
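One experiment that seems relevant (the zero variable is my own addition): if the 0 is pulled out into a non-constant int, the expression stops compiling altogether:
bool test = true;
int zero = 0;                          // no longer a constant expression
var value = test ? (UInt64)1 : zero;   // does not compile: no implicit conversion between ulong and int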
If the two types are implicitly convertible to each other, which of the two types will you end up with? (I don't think this happens with the built-in types, but user-defined classes could provide such conversions.)
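To be concrete about what I mean, here is a contrived sketch (A and B are made-up types, each declaring an implicit conversion to the other):
class A
{
    public static implicit operator B(A a) { return new B(); }
}

class B
{
    public static implicit operator A(B b) { return new A(); }
}

// which type does value get here, if this even compiles?
var value = test ? new A() : new B();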
Prior Research:
I found several other questions with similar titles, but they were all related to null or nullable types.
Relevance: This matters in my code because we immediately pass this result to ByteWriter.Write, and we want to end up with the overload that writes the correct number of bytes. The examples here are of course grossly simplified.
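Roughly like this; the ByteWriter shown here is a stand-in with made-up signatures, not the real class:
class ByteWriter
{
    public void Write(int value)   { /* writes 4 bytes */ }
    public void Write(long value)  { /* writes 8 bytes, signed */ }
    public void Write(ulong value) { /* writes 8 bytes, unsigned */ }
}

// ...
var writer = new ByteWriter();
// which overload gets called depends entirely on the type the conditional expression ends up with
writer.Write(test ? (UInt64)1 : 0);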
Alternate syntax that makes the result explicit may be the best option for clarity, no matter what is actually happening without the explicit cast:
var value = test ? (UInt64)1 : (UInt64)0;
Be aware that there is one set of implicit conversions between integer types when the numbers are compile-time constants (literals), and another set of conversions when they are not constants.
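To illustrate with plain assignments first (a throwaway example of mine, not yet involving the conditional operator):
ulong a = 1;          // compiles: 1 is a constant expression and its value fits in ulong
int i = 1;
ulong b = i;          // does not compile: no implicit conversion from a non-constant int to ulong
ulong c = (ulong)i;   // fine with an explicit cast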
Your interesting example is:
var value = test ? (UInt64)1 : 0;
which could also be written:
var value = test ? 1ul : 0;
where the ul suffix means ulong, i.e. System.UInt64.
When literals (constants) are used, there does exist an implicit conversion from int (System.Int32) to ulong, but only when that int constant is non-negative. It's really the same as:
ulong a = 1ul;
const int b = 0;
var value = test ? a : b;   // compiles: value ends up as ulong
It works, as I said, because of the implicit constant conversion from int (since the compiler knows b is not negative) to ulong.
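For example (my own variation), the non-negative part is what makes the difference:
var v1 = test ? 1ul : 0;    // compiles: the constant 0 is implicitly converted to ulong
var v2 = test ? 1ul : -1;   // does not compile: -1 is negative, so no implicit conversion to ulong exists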
Now, take away the const to get:
ulong a = 1ul;
int b = 0;
var value = test ? a : b;   // does not compile
We see that with non-constants, no implicit conversion exists in either direction, so there is no best common type for a and b, and this fails.