Subject: Python using compass sensor input: Stuck on calculation
I’m taking a compass reading in degrees named “start”, turning roughly 360 degrees, and taking a second reading named “end.” I want the delta between start and end: not how many degrees I turned, but how different end is from start.
degrees_start = 0
degrees_end = 359
#degrees_diff = degrees_end - degrees_start
degrees_diff = (degrees_start-degrees_end) % 360
print(degrees_diff)
'''
test set
degrees_start, degrees_end, deg_diff(expected), deg_diff(observed)
10, 20, +10, +10
20, 10, -10, -10
350, 10, +20, -340
10, 350, -20, +340
0, 359, -1, +359
359, 0, +1, -359
'''
Algorithm is pretty easy: end – start = delta
But I’m stuck at the boundary between 359 and 0. Example: start 10, end 350; or start 350, end 10. I’ve tried many arithmetic combinations but haven’t come up with a formula that is always correct.
Any suggestions? Thanks.
Tests of some answers below:
# test 10,350 -> correct answer -> -20 i.e. 20 deg short of full circle
#degrees_diff = degrees_end - degrees_start # test 10,350 -> 340
#degrees_diff = (degrees_start-degrees_end) % 360 # test 10,350 -> 20
#degrees_diff = (degrees_end - degrees_start) % 360 # test 10,350 -> 340
You need the modulo operator (% in Python). The algorithm becomes delta = (end - start) % 360. In Python, the result of % with a positive divisor is guaranteed to be in the semi-open interval [0, 360). If you prefer results in [-180, 180), you could use:
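A sketch of the standard signed-delta trick: shift by 180 before taking the modulo, then shift back. The function name signed_delta is mine, not from the original answer; the checks reuse the expected values from the question’s test set.

```python
def signed_delta(start, end):
    """Smallest signed difference between two headings, in [-180, 180)."""
    return (end - start + 180) % 360 - 180

# Check against the test set from the question:
cases = [(10, 20, 10), (20, 10, -10),
         (350, 10, 20), (10, 350, -20),
         (0, 359, -1), (359, 0, 1)]
for start, end, expected in cases:
    assert signed_delta(start, end) == expected
```

This works because Python’s % always returns a result with the sign of the divisor, so the intermediate value lands in [0, 360) before the final -180 shift.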