I've made a robot controlled by Arduino and Processing that moves around a room by rotating itself (like a sphere).
What I need is to be able to get its new location once it moves across the floor (say within a 3m x 3m room). I'm using a 9DOF sensor (3 axes of accelerometer data, 3 axes of gyroscope data, and 3 axes of magnetometer data) to determine its roll, pitch, and yaw, and also its heading.
How can I accurately determine the robot's location in Cartesian (x, y, z) coordinates relative to its starting position? I cannot use GPS, since the movement is less than 20 cm per rotation and the robot will be used indoors.
I found some indoor ranging and 3D positioning solutions, like Pozyx or a fixed camera, but I need something cost-efficient.
Is there any way to convert the 9DOF data into the new location, or another sensor that can do that? Or any other solution, such as an algorithm?
As someone points out in the comments, integrating acceleration gives velocity, and integrating that again gives position. This is, however, not very accurate, as errors accumulate very quickly.
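To make that concrete, here is a minimal sketch of the naive double integration along one axis. `readAccelX()` is a hypothetical placeholder for whatever your IMU library actually provides; the stub just shows where the real read goes. A constant sensor bias `b` ends up as `0.5 * b * t^2` in the position estimate, which is why this diverges within seconds:

```cpp
// Naive dead reckoning along one axis by double integration.
// readAccelX() is a hypothetical placeholder for your IMU driver;
// it should return gravity-compensated acceleration in m/s^2.
float readAccelX() { return 0.0f; }  // replace with your real sensor read

float velocity = 0.0f;        // m/s
float position = 0.0f;        // m
unsigned long lastMicros = 0;

void setup() {
  lastMicros = micros();
}

void loop() {
  unsigned long now = micros();
  float dt = (now - lastMicros) * 1e-6f;  // elapsed time in seconds
  lastMicros = now;

  velocity += readAccelX() * dt;  // first integration: velocity
  position += velocity * dt;      // second integration: position

  // A constant bias b in the accelerometer shows up as 0.5*b*t^2 here,
  // so even a tiny offset makes the position estimate diverge quickly.
}
```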
Instead, what people do is use "sensor fusion", which combines the data from several sensors into a better estimate of, for example, the position. It will still accumulate error over time if you rely on the accelerometer and gyro alone. The magnetometer will help, but the result will probably still be inaccurate.
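The simplest form of sensor fusion, short of a full Kalman filter, is a complementary filter: trust the integrated gyro over short timescales, and let the accelerometer's gravity vector pull the estimate back to cancel the long-term drift. Here is a minimal sketch for pitch, where `readGyroY()`, `readAccelX()`, and `readAccelZ()` are again hypothetical placeholders for your IMU driver:

```cpp
#include <math.h>

// Hypothetical placeholders -- replace with your real IMU driver calls.
float readGyroY()  { return 0.0f; }  // pitch rate in deg/s
float readAccelX() { return 0.0f; }  // m/s^2
float readAccelZ() { return 9.81f; } // m/s^2

const float ALPHA = 0.98f;  // gyro weight; typically tuned between 0.90 and 0.99
float pitch = 0.0f;         // fused pitch estimate in degrees

// Complementary filter: the integrated gyro is smooth but drifts,
// the accelerometer angle is noisy but drift-free; blend the two.
void updatePitch(float dt) {
  float gyroPitch  = pitch + readGyroY() * dt;
  float accelPitch = atan2(readAccelX(), readAccelZ()) * 180.0f / M_PI;

  pitch = ALPHA * gyroPitch + (1.0f - ALPHA) * accelPitch;
}
```

The same blending idea extends to yaw with the magnetometer in place of the accelerometer.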
I have found the following guide online, which gives an introduction to sensor fusion with Kalman filters on an Arduino:
http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1114&context=aerosp
Warning: you need to know some math to get this up and running.
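To give a feel for the structure the guide builds up to, here is a bare-bones one-dimensional Kalman filter (scalar state, scalar measurement). The attitude filters in the paper use matrices instead of scalars, but the predict/update cycle is the same:

```cpp
// Bare-bones 1-D Kalman filter: one state, one noisy measurement.
struct Kalman1D {
  float x;  // state estimate
  float p;  // estimate variance
  float q;  // process noise variance (how much we trust the model)
  float r;  // measurement noise variance (how much we trust the sensor)
};

void kalmanUpdate(Kalman1D &k, float measurement) {
  // Predict: the state is modeled as constant, so only uncertainty grows.
  k.p += k.q;

  // Update: blend prediction and measurement by their relative trust.
  float gain = k.p / (k.p + k.r);     // Kalman gain in [0, 1]
  k.x += gain * (measurement - k.x);  // correct toward the measurement
  k.p *= (1.0f - gain);               // uncertainty shrinks after the update
}

// Usage: Kalman1D k = {0.0f, 1.0f, 0.01f, 0.5f};
//        kalmanUpdate(k, sensorReading);  // call once per sample
```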