I feel really stupid about this, but I'm having some problems calculating the percentage change when working with negative numbers.
The calculation I'm using gives the expected result when both numbers are > 0.
decimal rateOfChange = (newNumber - oldNumber) / Math.Abs(oldNumber);
Let's say I have two numbers, 0.476 (newNumber) and -0.016 (oldNumber). That's an increase of 0.492, and with my calculation the rate of change is 3 075%.
If I instead have 0.476 (newNumber) and 0.001 (oldNumber), that's an increase of 0.475, and my calculation gives a rate of change of 47 500%, which seems correct.
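Here is a small snippet reproducing both cases with my formula (RateOfChange is just a hypothetical helper name wrapping the line above):

using System;

decimal RateOfChange(decimal newNumber, decimal oldNumber) =>
    (newNumber - oldNumber) / Math.Abs(oldNumber);

Console.WriteLine(RateOfChange(0.476m, -0.016m)); // 30.75  ->  3 075%
Console.WriteLine(RateOfChange(0.476m,  0.001m)); // 475    -> 47 500%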
The blue line represents example one and the red line example two. In my world, the blue line should have the bigger % change.
How would I write this calculation so it gives me the correct % change when also dealing with negative numbers? I want it to handle both increases and decreases.
I understand this is a math issue and that I need to work on my math.
You are mixing two concepts: absolute and relative deviation.
You seem to expect that bigger absolute deviations imply bigger relative deviations, which is false. You also seem to think that negative numbers are the cause of the unexpected (but correct) results you are getting.
Relative deviation depends on the magnitude of the absolute deviation and the magnitude of your reference value, not its sign. You can have smaller absolute deviations that imply really big relative deviations and vice versa.
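To make that concrete with your own numbers, here is a minimal sketch that computes the two kinds of deviation side by side (the helper names are mine, nothing standard):

using System;

// Absolute deviation: how much the value moved.
decimal AbsoluteDeviation(decimal newValue, decimal oldValue) =>
    newValue - oldValue;

// Relative deviation: how much it moved compared to the reference value.
decimal RelativeDeviation(decimal newValue, decimal oldValue) =>
    (newValue - oldValue) / Math.Abs(oldValue);

// Example one: bigger absolute deviation, smaller relative deviation.
Console.WriteLine(AbsoluteDeviation(0.476m, -0.016m)); // 0.492
Console.WriteLine(RelativeDeviation(0.476m, -0.016m)); // 30.75 ->  3 075%

// Example two: smaller absolute deviation, much bigger relative deviation,
// because the reference value 0.001 is tiny.
Console.WriteLine(AbsoluteDeviation(0.476m, 0.001m));  // 0.475
Console.WriteLine(RelativeDeviation(0.476m, 0.001m));  // 475   -> 47 500%

Your formula is already doing the right thing; the surprise comes from dividing by a reference value that is very close to zero.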