I'm trying to implement skeletal animation in my game engine, but I'm running into problems when animating the bones in C++: the character animates, but the torso and arms appear twisted. I'm following this OpenGL tutorial and expect the model to look like this, but instead it looks like the video below. The vertices and bones are being parsed and uploaded properly, yet when I apply animations to the bones this happens.
I've assembled all the bones into an array and saved their parent indices so I can work through the bone hierarchy, as was demonstrated in this tutorial. I believe the problem is somewhere in here, because the model renders fine in T-pose when I use m_LocalMatrix instead of the AnimatedTransform in the first for loop. However, when I do that I must also multiply in the offset matrix for that bone when calculating the FinalTransformation matrix in the last for loop (I've put a short sketch of what I mean right after the code). Here is my code to animate the model:
std::vector<FMatrix> LocalTransforms( m_SkeletalMesh->Joints.size() );
std::vector<FMatrix> ModelTransforms( m_SkeletalMesh->Joints.size() );

// First pass: build each joint's local transform, either interpolated from the
// animation keyframes or taken from the bind-pose local matrix.
for (uint32 i = 0; i < m_SkeletalMesh->Joints.size(); i++)
{
    FJoint& joint = m_SkeletalMesh->Joints[i];
    if (m_anim->m_KeyFrames.find( joint.m_Name ) != m_anim->m_KeyFrames.end())
    {
        // Find the pair of keyframes surrounding the current animation time.
        uint32 KeyIndex = 0;
        uint32 NextKeyIndex = 0;
        for (uint32 i = 0; i < m_anim->m_KeyFrames[joint.m_Name].size() - 1; i++)
        {
            if (AnimationTimeTicks < m_anim->m_KeyFrames[joint.m_Name][i + 1].m_Timestamp)
            {
                KeyIndex = i;
                break;
            }
        }
        NextKeyIndex = KeyIndex + 1;

        // Interpolation factor between the two keyframes.
        float t1 = m_anim->m_KeyFrames[joint.m_Name][KeyIndex].m_Timestamp;
        float t2 = m_anim->m_KeyFrames[joint.m_Name][NextKeyIndex].m_Timestamp;
        float DeltaTime = t2 - t1;
        float Factor = (AnimationTimeTicks - (float)t1) / DeltaTime;
        HE_ASSERT( Factor >= 0.f && Factor <= 1.f );

        FTransform& Start = m_anim->m_KeyFrames[joint.m_Name][KeyIndex].m_AnimatedTransform;
        FTransform& End = m_anim->m_KeyFrames[joint.m_Name][NextKeyIndex].m_AnimatedTransform;
        FTransform AnimatedTransform = FTransform::Interpolate( Start, End, Factor );
        LocalTransforms[i] = AnimatedTransform.GetLocalMatrix();
    }
    else
    {
        LocalTransforms[i] = joint.m_LocalMatrix;
    }
}

// Second pass: accumulate parent transforms. Parents always come before their
// children in the Joints array, so a single forward pass is enough.
ModelTransforms[0] = LocalTransforms[0];
for (uint32 i = 1; i < m_SkeletalMesh->Joints.size(); i++)
{
    FJoint& joint = m_SkeletalMesh->Joints[i];
    ModelTransforms[i] = ModelTransforms[joint.m_ParentIndex] * LocalTransforms[i];
}

// Final pass: write the per-joint matrices into the joint constant buffer.
JointCBData* pJointCB = m_SkeletalMesh->m_JointCB.GetBufferPointer();
for (uint32 i = 0; i < m_SkeletalMesh->Joints.size(); i++)
{
    FJoint& joint = m_SkeletalMesh->Joints[i];
    FMatrix& FinalTransform = pJointCB->kJoints[i];
    FinalTransform = m_SkeletalMesh->m_GlobalInverseTransform * ModelTransforms[i];
}
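For clarity, this is what I meant about the T-pose case: if I use joint.m_LocalMatrix in the first loop, the model only renders correctly when the last loop also multiplies in that joint's offset matrix. A minimal sketch of that variant, using the same names and multiplication order as above:

// T-pose variant that renders correctly for me: same chain as the last loop
// above, but with the per-joint offset (inverse bind) matrix appended.
for (uint32 i = 0; i < m_SkeletalMesh->Joints.size(); i++)
{
    FJoint& joint = m_SkeletalMesh->Joints[i];
    pJointCB->kJoints[i] = m_SkeletalMesh->m_GlobalInverseTransform
                         * ModelTransforms[i]
                         * joint.m_OffsetMatrix;
}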
I'm confident the hierarchy is being parsed correctly, but here is my code that does it:
void ProcessJoints( std::vector<FJoint>& Joints, const aiNode* pNode, const uint32& ParentIndex, const aiMesh* pMesh )
{
    // Append a joint for this node and record its parent index so the
    // hierarchy can later be walked as a flat array.
    FJoint& joint = Joints.emplace_back();
    strcpy_s( joint.m_Name, sizeof( joint.m_Name ), pNode->mName.C_Str() );
    joint.m_NameHash = StringHash( joint.m_Name );
    joint.m_ParentIndex = ParentIndex;
    memcpy( &joint.m_LocalMatrix, &pNode->mTransformation, sizeof( FMatrix ) );

    // Copy the offset (inverse bind) matrix from the matching aiBone, if any.
    for (uint32 i = 0; i < pMesh->mNumBones; i++)
    {
        if (pNode->mName == pMesh->mBones[i]->mName)
        {
            memcpy( &joint.m_OffsetMatrix, &pMesh->mBones[i]->mOffsetMatrix, sizeof( FMatrix ) );
            break;
        }
    }

    // Recurse into the children with this joint's index as their parent.
    uint32 NewParentIndex = (uint32)Joints.size() - 1u;
    for (uint32 i = 0; i < pNode->mNumChildren; i++)
    {
        ProcessJoints( Joints, pNode->mChildren[i], NewParentIndex, pMesh );
    }
}
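For what it's worth, a quick way to eyeball the parenting is to dump the flattened array after loading. Something like this throwaway helper (just printf, not engine code):

// Debug helper: print each joint with its parent's name so the flattened array
// can be compared against the node tree in Blender / the Assimp viewer.
// Needs <cstdio>.
void DumpJointHierarchy( const std::vector<FJoint>& Joints )
{
    for (size_t i = 0; i < Joints.size(); i++)
    {
        const FJoint& joint = Joints[i];
        const char* pParentName = (i == 0) ? "<root>" : Joints[joint.m_ParentIndex].m_Name;
        printf( "[%zu] %s (parent: %s)\n", i, joint.m_Name, pParentName );
    }
}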
Here is my shader code as well, which is taken from Frank Luna's DirectX 12 book:
// Reconstruct the fourth weight so the four weights sum to 1.
float weights[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
weights[0] = Input.Weights[0];
weights[1] = Input.Weights[1];
weights[2] = Input.Weights[2];
weights[3] = 1.0f - weights[0] - weights[1] - weights[2];

// Blend the vertex position by each influencing joint's matrix.
float3 totalLocalPos = float3(0, 0, 0);
for (int i = 0; i < HE_MAX_JOINTS_PER_VERTEX; i++)
{
    totalLocalPos += weights[i] * mul( float4(Input.Position, 0), Joints[Input.JointIDs[i]] ).xyz;
}

Result.Position = mul( float4(totalLocalPos, 1), WorldViewProjection );
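The shader assumes the weights sum to 1 and that every joint index is valid, so a CPU-side check of the skinning data before upload would look roughly like this. FSkinnedVertex, Vertices, Weights and JointIDs are placeholder names here, since my actual vertex struct isn't shown above:

// Hypothetical sanity check over the skinned vertex data; the names below are
// placeholders for whatever the real vertex layout uses.
for (const FSkinnedVertex& v : Vertices)
{
    float WeightSum = v.Weights[0] + v.Weights[1] + v.Weights[2] + v.Weights[3];
    HE_ASSERT( fabsf( WeightSum - 1.f ) < 0.01f );                   // weights normalized
    for (uint32 j = 0; j < 4; j++)
        HE_ASSERT( v.JointIDs[j] < m_SkeletalMesh->Joints.size() );  // joint index in range
}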
I've been banging my head against this for weeks now and just cannot find the problem. Any help would be appreciated!
Other things I tried with no success were:
- Updated Assimp to the latest version, thinking the problem might be with the Assimp library itself
- Made sure the vertices have the proper data types, which was the solution to this problem
- Made sure the matrices were row-major in C++ and column-major for HLSL
- Tried different models with different animations. Each model had the same stretched/deformed look
- Re-exported the same models from Blender, since I saw that some models can cause issues otherwise, like in this tutorial
Assimp is majorly OpenGL-oriented, so to make it work with DirectX you have to make a few adjustments.
In Assimp::Importer::ReadFile(), pass the Assimp post-processing flags aiProcess_MakeLeftHanded | aiProcess_FlipWindingOrder. Also set the bool property AI_CONFIG_IMPORT_FBX_PRESERVE_PIVOTS to true, using the function Assimp::Importer::SetPropertyBool().
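Something like this; the file name and the aiProcess_Triangulate flag are just placeholders for whatever you already pass:

#include <assimp/Importer.hpp>
#include <assimp/postprocess.h>
#include <assimp/scene.h>
#include <assimp/config.h>

Assimp::Importer importer;
// Keep the FBX pivot nodes instead of collapsing them.
importer.SetPropertyBool( AI_CONFIG_IMPORT_FBX_PRESERVE_PIVOTS, true );
const aiScene* pScene = importer.ReadFile(
    "Character.fbx",                 // placeholder path
    aiProcess_Triangulate            // whatever flags you already use
    | aiProcess_MakeLeftHanded       // convert to a left-handed coordinate system
    | aiProcess_FlipWindingOrder );  // flip face winding to match DirectX culling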
Also, it appears that the vertices are inverted while animating, so change the transformation or offset matrix from column-major to row-major to make the model appear properly.
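In your ProcessJoints() that would mean transposing the aiMatrix4x4 before the memcpy (Assimp stores its matrices row-major). A rough sketch; whether your FMatrix wants the transposed or the untransposed layout depends on your math library, so treat it as something to try:

// Sketch only: transpose the Assimp matrices before copying them into FMatrix.
// aiMatrix4x4::Transpose() transposes in place.
aiMatrix4x4 NodeTransform = pNode->mTransformation;
NodeTransform.Transpose();
memcpy( &joint.m_LocalMatrix, &NodeTransform, sizeof( FMatrix ) );

aiMatrix4x4 OffsetMatrix = pMesh->mBones[i]->mOffsetMatrix;
OffsetMatrix.Transpose();
memcpy( &joint.m_OffsetMatrix, &OffsetMatrix, sizeof( FMatrix ) );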
You can go through this link as well.