I was testing the casting behavior of C# in an unchecked context. As the documentation says, in an unchecked context the cast always succeeds. But in some particular cases, casting from one specific type to another gives unexpected results.
For example, I tested three "double to sbyte" casts:
var firstCast = (sbyte) -129.83297462979882752; // Result: 127
var secondCast = (sbyte) -65324678217.74282742874973267; // Result: 0
var thirdCast = (sbyte) -65324678216.74282742874973267; // Result: 0
Just to be clear, the difference between the second and the third double is exactly 1 (the third value is the second plus 1).
In this case, the cast seems to always produce 0 for any "big" double value.
My question is: why do the second and third casts result in 0?
I searched the C# documentation for an answer, but I did not find one.
I tested the above with .NET Framework 4.7.2.
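For reference, here is a self-contained version of the test, as a sketch (the unchecked(...) wrappers match the unchecked context described above, and are also needed because the operands are constant expressions, whose conversions are checked by default):

using System;

class Program
{
    static void Main()
    {
        // Without unchecked(...), the compiler rejects these out-of-range
        // constant conversions instead of deferring them to run time.
        Console.WriteLine(unchecked((sbyte) -129.83297462979882752));         // prints 127 here
        Console.WriteLine(unchecked((sbyte) -65324678217.74282742874973267)); // prints 0 here
        Console.WriteLine(unchecked((sbyte) -65324678216.74282742874973267)); // prints 0 here
    }
}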
According to the C# language specification, the processing of a conversion from double to an integral type depends on the overflow checking context in which the conversion takes place. Without using the checked or unchecked operators, by default the overflow checking context is unchecked, so we look at the rules for an unchecked context:

In an unchecked context, the conversion always succeeds, and proceeds as follows:

- If the value of the operand is NaN or infinite, the result of the conversion is an unspecified value of the destination type.
- Otherwise, the source operand is rounded towards zero to the nearest integral value. If this integral value is within the range of the destination type, then this value is the result of the conversion.
- Otherwise, the result of the conversion is an unspecified value of the destination type.

Here, the values are neither NaN nor infinite. When rounded towards zero, they are not within the valid range of sbyte, which is -128 to 127, so the last bullet point applies: the result of such a cast is unspecified.

In other words, the result of this cast depends on which compiler you are using. Different compilers could do different things and still be conforming C# compilers. It is likely that the compiler you are using simply chose to produce 0 when the value to convert is very far outside the lower/upper bound of the destination type.
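If you need deterministic behavior, you can opt into a checked context, which throws an OverflowException instead of producing an unspecified value. A minimal sketch (names are illustrative; the value is stored in a run-time variable so the conversion is evaluated at run time rather than folded by the compiler):

using System;

class CheckedCastDemo
{
    static void Main()
    {
        double big = -65324678217.74282742874973267; // run-time variable, not a constant expression

        // Default (unchecked) context for non-constant expressions:
        // the out-of-range conversion yields an unspecified sbyte value
        // (0 was observed in the question).
        Console.WriteLine((sbyte)big);

        // Checked context: the same conversion throws at run time
        // instead of silently producing an unspecified value.
        try
        {
            Console.WriteLine(checked((sbyte)big));
        }
        catch (OverflowException)
        {
            Console.WriteLine("double value is out of range for sbyte");
        }
    }
}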