How to generically track relative position, velocity and orientation from IMUs?


For a light painting project (could be something else, too) I'd like to calculate relative position, orientation and velocity from 6-axis IMUs (I have an MPU6886 connected to an ESP32, but I hope that doesn't matter). Of course there are plenty of algorithms out there. Some are purely mathematical, some are implemented in specific languages. Most of them calculate only position or orientation, and most are quite old, first-shot answers to very basic questions. I think I know the basic math, but I don't want to reinvent the wheel, and I'd like to approach a reusable solution targeting real-world problems like:

  • improving precision by combining sensor data
  • jittering
  • filtering
  • different sensor types
  • auto-calibration
  • working with/without extra sensor data (e.g. compass)

I'd like to keep the approach language agnostic, i.e. I want to start with a Python (MicroPython) implementation but keep the option to migrate to a native (C++/Rust) library with bindings to other languages.

So, to turn this fuzzy intent into a Stack Overflow question: what's a sophisticated way to turn a stream of (acceleration, rotation, time) data points into a stream of (relative position, velocity, orientation) data points (using Python, if that matters)? What do I have to keep in mind? What existing open-source projects with generic positioning are out there? The accepted answer might also just clarify the terminology and provide links to good (up-to-date) articles on this topic.
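To make the question concrete, here is a minimal dead-reckoning sketch of the pipeline I mean (all names are mine, not from any library; the orientation is reduced to a single yaw angle to keep the example planar, and gravity is assumed to be already removed — a real solution would need quaternions, gravity compensation and sensor fusion, which is exactly what I'm asking about):

```python
import math

def dead_reckon(samples):
    """samples: iterable of (dt, ax, ay, gz) in the body frame, where
    dt is the time step [s], (ax, ay) planar acceleration [m/s^2] with
    gravity already removed, and gz the yaw rate [rad/s].
    Yields (yaw, vx, vy, px, py) after each sample."""
    yaw = vx = vy = px = py = 0.0
    for dt, ax, ay, gz in samples:
        yaw += gz * dt                       # integrate gyro -> orientation
        # rotate body-frame acceleration into the world frame
        wx = ax * math.cos(yaw) - ay * math.sin(yaw)
        wy = ax * math.sin(yaw) + ay * math.cos(yaw)
        vx += wx * dt                        # integrate accel -> velocity
        vy += wy * dt
        px += vx * dt                        # integrate velocity -> position
        py += vy * dt
        yield yaw, vx, vy, px, py
```

The naive integration above drifts badly in practice; the question is what the state of the art is for doing this robustly.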

1 Answer

I am currently working on something similar. Check out https://www.samba.org/tridge/UAV/madgwick_internal_report.pdf for a thorough explanation of the topic. For position, you'd have to integrate acceleration twice over time (acceleration to velocity, velocity to position), although the numerical errors are probably going to be terrible; I am not sure there is a solution without fusing your sensor with GPS.
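To see why the numerical errors are terrible: a constant accelerometer bias b, left uncorrected, produces a position error that grows quadratically, roughly 0.5·b·t². A quick back-of-the-envelope check (the numbers here are illustrative, not a measurement of any particular IMU):

```python
def drift(bias, dt, steps):
    """Double-integrate a constant acceleration bias [m/s^2] and
    return the accumulated position error [m]."""
    v = p = 0.0
    for _ in range(steps):
        v += bias * dt   # bias leaks into velocity
        p += v * dt      # and from there into position
    return p

# a 0.01 m/s^2 bias (optimistic for a consumer MEMS IMU) over 60 s:
err = drift(0.01, 0.01, 6000)   # ~= 0.5 * 0.01 * 60**2 = 18 m
```

Eighteen meters of error in one minute from a tiny bias is why orientation-only filters like Madgwick's are common, while standalone IMU positioning is not.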