How to convert 3D pose sequences to the BVH file format?

I am very new to animation and rendering software, so please let me know if I need to provide more information about this. I have a sequence of 3D positions of human joints (basically mocap data) representing different kinds of walking, and I have managed to visualize the sequence using Python, as shown in this video. Each sequence I have is a numpy array of shape TxJx3, where T is the number of frames, J is the number of joints (21 in my case), and 3 is the number of coordinate values. So my question is: how can I convert these 3D positions into a BVH file that I can load into Blender? Or convert them to any other format that lets me load the data into Blender?
OK, found the solution myself. Posting here in case anyone else finds this useful. Please excuse the absence of LaTeX rendering; apparently Stack Overflow does not support it (yet), and I'm too new here to be able to attach images.
So, in the BVH format, the following relationship holds between the joints:
$$pos_j = R_{P(j)}offset_j + pos_{P(j)}$$
where $pos_j$ is the 3D position of joint $j$; $P(j)$ returns the parent of joint $j$ in whatever DAG the positions are modeled in (generally the DAG starts at the root and points towards the end-effectors); $offset_j$ is the offset of joint $j$ relative to its parent $P(j)$ (i.e. the connecting limb); and $R_{P(j)}$ is the 3D rotation that determines how much $offset_j$ should be rotated away from an initial pose (generally a T-pose). In the BVH format, for each joint $j$, we need to store the relative rotation $R_{P(j)}^{-1}R_j$, i.e. the rotation of joint $j$ with respect to its parent.
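For reference, here is a minimal numpy sketch of that relationship, assuming you have already estimated a global 3x3 rotation matrix per joint and that every parent appears before its children in the arrays; the function names and the parents-list layout are my own, not from any particular library:

```python
import numpy as np

# Hypothetical helpers -- a sketch of the relationship above, not code from the
# repo mentioned below. `parents[j]` is the index of joint j's parent
# (-1 for the root), and parents are assumed to come before their children.

def local_rotations(global_rots, parents):
    """Turn per-joint *global* rotation matrices (J, 3, 3) into the per-joint
    *local* rotations R_{P(j)}^{-1} R_j that the BVH channels store."""
    local = np.empty_like(global_rots)
    for j, p in enumerate(parents):
        if p == -1:
            local[j] = global_rots[j]                        # root: local == global
        else:
            local[j] = global_rots[p].T @ global_rots[j]     # R^{-1} == R^T for rotations
    return local

def forward_kinematics(local_rots, offsets, parents, root_pos):
    """Recover world positions for one frame via pos_j = R_{P(j)} offset_j + pos_{P(j)}."""
    J = len(parents)
    global_rots = np.empty((J, 3, 3))
    pos = np.empty((J, 3))
    for j, p in enumerate(parents):
        if p == -1:
            global_rots[j] = local_rots[j]
            pos[j] = root_pos
        else:
            global_rots[j] = global_rots[p] @ local_rots[j]
            pos[j] = global_rots[p] @ offsets[j] + pos[p]
    return pos
```

Since BVH stores the local rotations as Euler angles (e.g. ZXY order), something like scipy.spatial.transform.Rotation.from_matrix(...).as_euler('ZXY', degrees=True) can do the final matrix-to-Euler conversion before writing the MOTION block.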
The main trouble I had was working with joints that have multiple children, for example the root joint, which connects to both legs as well as the spine. I eventually came across this repo and, digging through their function forward_kinematics inside skeleton.py, realized what to do. Basically, for joints with multiple children, I had to make copies with $offset = 0$ and assign those as parents of the corresponding chains. Thus I made 3 copies of the root: one became the parent of the left leg chain, one of the right leg chain, and one of the spine. I did the same for the other parents with multiple children. And yes, the visualization works great!
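In case it helps, here is a minimal sketch of that splitting step (the function name and the parents/offsets layout are my own, not taken from the repo):

```python
import numpy as np

def split_multi_child_joints(parents, offsets):
    """For every joint with more than one child, give each child chain its own
    zero-offset copy of that joint, so each chain's first bone gets its own
    rotation. `parents[j]` is the parent index (-1 for the root) and `offsets`
    has shape (J, 3). Hypothetical helper, not the repo's code."""
    parents = list(parents)
    offsets = [np.asarray(o, dtype=float) for o in offsets]
    n_original = len(parents)
    children = [[c for c, p in enumerate(parents) if p == j] for j in range(n_original)]
    for j, kids in enumerate(children):
        if len(kids) <= 1:
            continue
        for c in kids:                      # one zero-offset copy per child chain
            dup = len(parents)
            parents.append(j)               # the copy hangs off the original joint
            offsets.append(np.zeros(3))     # offset = 0, so positions are unchanged
            parents[c] = dup                # re-parent the chain onto its copy
    return parents, np.stack(offsets)
```

Note that the appended copies break the parent-before-child index ordering, so re-sort the joints (or build the BVH hierarchy recursively) if later code relies on that ordering.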