I'm implementing a simple Android app in which I need to find north.
So I've implemented SensorEventListener,
and I used something like this:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, orientationVals);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: " + orientationVals[1] + "\n Roll: " + orientationVals[2]);
    }
}
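(As an aside: getOrientation reports its angles in radians, with the azimuth in [-π, π], so the degree conversion above can go negative. A minimal pure-Java sketch, with no Android dependency, of normalizing the azimuth to [0, 360) — the class and helper names are my own:)

```java
public class AzimuthDemo {
    // Hypothetical helper: converts an azimuth in radians (range [-PI, PI],
    // as returned by SensorManager.getOrientation) to degrees in [0, 360).
    static float azimuthToDegrees(double radians) {
        float deg = (float) Math.toDegrees(radians);
        return (deg + 360f) % 360f;
    }

    public static void main(String[] args) {
        System.out.println(azimuthToDegrees(0.0));          // due north -> 0.0
        System.out.println(azimuthToDegrees(-Math.PI / 2)); // due west -> 270.0
    }
}
```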
The problem is that the line

SensorManager.remapCoordinateSystem(mRotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, orientationVals);

seems to have no effect: in both cases (with or without this line) the Yaw (azimuth) value depends on where the top of the phone points (with the phone lying flat on its back).
What I expected from my remapping was that Yaw would change based on where the back of the phone (the rear camera) points.
Why doesn't my remapping work?
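For what it's worth, the fourth argument of remapCoordinateSystem is documented as an output rotation matrix (a float[9] or float[16]), not an orientation array, and getOrientation then has to read that remapped matrix. A sketch of how the calls are documented to chain (the field name mRemappedMatrix is mine; this is not a guaranteed fix for the behavior I'm seeing):

```java
// Sketch: remap into a separate 3x3 output matrix, then read the
// orientation from the remapped matrix rather than the original one.
float[] mRemappedMatrix = new float[9]; // hypothetical field, name is mine

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        // The 4th parameter is the OUTPUT matrix; per the docs it must be a
        // float[9] (or float[16]) distinct from the input matrix, not the
        // 3-element orientation array.
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z, mRemappedMatrix);
        // Read the orientation from the remapped matrix.
        SensorManager.getOrientation(mRemappedMatrix, orientationVals);
    }
}
```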