Convert Azure Kinect JSON output to BVH


My current process goes like this:

I used the k4arecorder that comes with the Azure Kinect SDK v1.4.1 to record two MKV videos.

In the first, the person is in a T pose and in the second, they are doing some motion.

I then used the offline_processor from Microsoft/Azure-Kinect-Samples/body-tracking-samples to convert these two videos into JSON objects. For each frame, the JSON contains the x, y, z position of every joint (z is relative to the camera, and y+ points downward) as well as the quaternion orientation of every joint.

For the T-pose JSON object, I extracted one frame of positions and rotations where the T-pose was perfect. I then parsed that frame into two pandas DataFrames, one of positions and one of orientations, and converted the orientations into Euler angles.

For the second, 'motion' JSON object, I parsed the data into two more pandas DataFrames. In the positions DataFrame, each row is a frame and the columns have the form

             joint_1.x, joint_1.y, joint_1.z       ...      joint_n.x, joint_n.y, joint_n.z 

In the orientations DataFrame, each row is also a frame and the columns have the form

             joint_1.z, joint_1.y, joint_1.x       ...      joint_n.z, joint_n.y, joint_n.x 
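For reference, the parsing step above can be sketched as follows. This is a minimal sketch, and the JSON field names (`frames`, `bodies`, `joint_positions`, `joint_orientations`) are assumptions about the offline_processor output, not confirmed names:

```python
import json

import pandas as pd


def frames_to_dataframes(path, joint_names):
    """Parse body-tracking JSON into a positions DataFrame and an
    orientations DataFrame, one row per frame (assumed schema)."""
    with open(path) as f:
        data = json.load(f)

    pos_rows, ori_rows = [], []
    for frame in data["frames"]:
        if not frame["bodies"]:
            continue  # no body detected in this frame
        body = frame["bodies"][0]  # assume one tracked body
        pos_row, ori_row = {}, {}
        for i, name in enumerate(joint_names):
            x, y, z = body["joint_positions"][i]
            pos_row.update({f"{name}.x": x, f"{name}.y": y, f"{name}.z": z})
            qw, qx, qy, qz = body["joint_orientations"][i]
            ori_row.update({f"{name}.qw": qw, f"{name}.qx": qx,
                            f"{name}.qy": qy, f"{name}.qz": qz})
        pos_rows.append(pos_row)
        ori_rows.append(ori_row)
    return pd.DataFrame(pos_rows), pd.DataFrame(ori_rows)
```

From these two DataFrames, the T-pose frame and the per-frame motion tables described above fall out by row selection.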

What I want to know is this:

How can I go from these four matrices, where all of the coordinates are in global space, to a BVH file? I've tried a number of solutions, but all have failed.

I'm missing some fundamental logic in this process, and if anybody can help, I would really appreciate it. Code solutions in any language are also welcome.

There are 2 answers below.

I've had some success with this.

You have to put the global orientations of the joints in the local orientation of their parent. But what does that actually mean?

Before you convert to the final Euler angles, you should first convert the quaternions to rotation matrices. The best tool I found for this is scipy.spatial.transform.Rotation.

First find the parent joint's rotation matrix, take its inverse, and then multiply that inverse by the child joint's matrix. This computes the local orientation of a joint relative to its parent.
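A minimal sketch of that computation with scipy.spatial.transform.Rotation. The w-first quaternion layout is an assumption about the sensor output; scipy expects (x, y, z, w), hence the reordering:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R


def local_euler(parent_quat_wxyz, child_quat_wxyz, order="zyx"):
    """Express the child's global orientation in its parent's local
    frame and return Euler angles in degrees."""
    # scipy wants (x, y, z, w); assume the SDK reports (w, x, y, z)
    parent = R.from_quat(np.roll(parent_quat_wxyz, -1))
    child = R.from_quat(np.roll(child_quat_wxyz, -1))

    # local = parent^-1 * child, done explicitly with rotation matrices
    local = parent.inv().as_matrix() @ child.as_matrix()
    return R.from_matrix(local).as_euler(order, degrees=True)
```

If the parent already carries the same global rotation as the child, the local rotation comes out as zero, which is a quick sanity check.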

This gives you the local orientation of the child joint. However, you're not out of the woods yet: this approach works for the legs, spine, and hips, but it produces really messed-up arms.


I suspect it's related to the Azure Kinect's joint-tracking orientations: in the SDK's joint-hierarchy diagram, the per-joint axes shift when they get to the arms.


I don't know enough about rotations to solve this final issue; everything gets messed up from the clavicles onward to the hands. Any suggestions would be great.


Hands are very complicated because the hierarchy above them is long. Use the rotation of the spine-chest joint to transform the arm motion, then use ZYX as the Euler rotation order.
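A hedged sketch of that suggestion, taking the arm joints relative to the spine-chest joint instead of their immediate parent. The joint choice and the (x, y, z, w) quaternion layout are assumptions:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R


def arm_joint_euler(spine_chest_quat_xyzw, arm_quat_xyzw):
    """Express an arm joint's global rotation relative to the
    spine-chest joint, read out in ZYX order (degrees)."""
    spine_chest = R.from_quat(spine_chest_quat_xyzw)
    arm = R.from_quat(arm_quat_xyzw)
    local = spine_chest.inv() * arm  # rotation composition in scipy
    return local.as_euler("zyx", degrees=True)
```

With an identity spine-chest rotation this reduces to the arm joint's own global rotation, so the change only matters once the torso starts rotating.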