I've been using (ulong) 1 to represent a single set bit in a 64-bit data type. I wonder whether (ulong) 1 performs some sort of type conversion and therefore takes more time than 1ul.
I'm just curious whether this makes any difference. I thought both were exactly the same in performance. Is that correct, or is (ulong) 1 actually slower?
No, no runtime type conversion happens when you apply a cast to a compile-time constant using a built-in numeric conversion (i.e. (ulong) 1). The compiler evaluates the cast at compile time, so both approaches result in the same IL (and hence the same machine code).
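As an illustrative sketch (the original example code was lost), both declarations below are folded by the compiler to the same constant, and in a typical Debug build each compiles to the same IL sequence of loading the 32-bit constant 1 and widening it to 64 bits (roughly ldc.i4.1 / conv.i8 / stloc):

```csharp
// Both right-hand sides are compile-time constants; the cast in the
// first line is evaluated by the compiler, not at runtime.
ulong a = (ulong)1;
ulong b = 1ul;

Console.WriteLine(a == b); // True
```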
From the language specification - 12.23 Constant expressions:
So in this case,
(ulong) 1 will be treated as a constant, exactly like the integer literal 1ul. Also see the Integer literals section of the docs.
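For the bit-manipulation use case from the question, this means both spellings can even appear in const declarations, since each whole expression is a constant expression evaluated at compile time (a small sketch, not from the original post):

```csharp
// Setting the top bit of a 64-bit mask: both forms are constant
// expressions and are folded to the same value at compile time.
const ulong FromCast = (ulong)1 << 63;
const ulong FromSuffix = 1ul << 63;

Console.WriteLine(FromCast == FromSuffix); // True
```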