I know that (-0 === 0) evaluates to true. I am curious to know why -0 < 0 appears to be the case here.
When I run this code in the Stack Overflow execution context, it returns 0.
const arr = [+0, 0, -0];
console.log(Math.min(...arr));
But when I run the same code in the browser console, it returns -0. Why is that? I have tried searching on Google but didn't find anything useful. This question might not have much practical value; I just want to understand how JS calculates this.
const arr = [+0, 0, -0];
console.log(Math.min(...arr)); // -0
-0 is not less than 0 or +0; both -0 < 0 and -0 < +0 return false. You're mixing up the behavior of Math.min with the comparison of -0 with 0/+0.

The specification of Math.min is clear on this point: the comparison used to determine the smallest value is the ordinary less-than comparison, except that +0 is considered to be larger than -0.
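A quick way to see the difference between the bare comparisons and Math.min (nothing beyond standard built-ins is assumed; you can paste this into any browser console):

console.log(-0 < 0);            // false
console.log(-0 < +0);           // false
console.log(-0 === +0);         // true, === does not distinguish the two zeros
console.log(Object.is(-0, +0)); // false, Object.is does distinguish them
console.log(Math.min(-0, +0));  // -0, because of the exception in the spec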
Without this exception, the behavior of Math.min and Math.max would depend on the order of their arguments, which could be considered odd behavior: you probably want Math.min(x, y) to always equal Math.min(y, x). So that might be one possible justification.

Note: This exception was already present in the 1997 specification for Math.min(x, y), so it's not something that was added later on.
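To illustrate the order-independence point, here is a small sketch; Object.is and the 1 / x trick are just two common ways to tell -0 apart from +0 in the result, not anything required by the spec:

const a = Math.min(-0, +0);
const b = Math.min(+0, -0);
console.log(Object.is(a, b));  // true, both calls return -0 regardless of argument order
console.log(Object.is(a, -0)); // true, the result really is -0
console.log(1 / a, 1 / b);     // -Infinity -Infinity, dividing by -0 reveals the sign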