Is the rotation matrix in pybullet converting world coordinates to camera coordinates, or camera to world?


I am working on a project where I need to replace the renderings produced by pybullet with renders generated with pytorch3d.

I figured out that pybullet and pytorch3d have different definitions of the coordinate systems (see these links: pybullet, pytorch3d; the x and z axes are flipped), and I accounted for that in my code. But the rendered objects are still inconsistent. I suspect the problem could be that while pytorch3d expects a c2w rotation matrix (i.e. camera-to-world), pybullet might expect a w2c (world-to-camera) matrix instead. However, I cannot find any documentation on this. Has anyone encountered this problem, or can anyone give a useful hint on how to find out what exactly pybullet expects its rotation matrix to be?
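One way to check the convention empirically, without digging through pybullet's source, is to rebuild an OpenGL-style look-at view matrix (the same construction `pybullet.computeViewMatrix` follows) and see which direction it maps points. The `look_at` helper below is my own sketch, not a pybullet function:

```python
import numpy as np

def look_at(eye, target, up):
    """OpenGL-style look-at view matrix (the convention pybullet follows)."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f = f / np.linalg.norm(f)      # forward direction (camera looks along -z)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)      # right direction
    u = np.cross(s, f)             # recomputed (true) up direction
    view = np.eye(4)
    view[0, :3], view[0, 3] = s, -s @ eye
    view[1, :3], view[1, 3] = u, -u @ eye
    view[2, :3], view[2, 3] = -f, f @ eye
    return view

# Camera 2 units above the world origin, looking down at it.
view = look_at(eye=[0, 0, 2], target=[0, 0, 0], up=[0, 1, 0])

# Transform a WORLD-frame point (the origin) with the matrix:
p_cam = view @ np.array([0, 0, 0, 1.0])
print(p_cam)  # -> [0. 0. -2. 1.]
```

The world origin lands at z = -2 in front of the camera (negative z, per the OpenGL convention), i.e. the matrix takes world coordinates to camera coordinates, so it is a w2c transform.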

Thanks!

Best answer:

I assume you are talking about the viewMatrix expected by pybullet.getCameraImage(). This should indeed be a world-to-camera rotation matrix.

However, in pybullet the camera looks in the negative z-direction, while I usually expect it to look along the positive one. I compensate for this by adding a 180° rotation around the x-axis:

import numpy as np

# 180-degree rotation around the x-axis: flips the camera from looking
# along +z to pybullet's -z (OpenGL) convention
rot_x_180 = np.array(
    [
        [1, 0, 0, 0],
        [0, -1, 0, 0],
        [0, 0, -1, 0],
        [0, 0, 0, 1],
    ]
)
tf_mat = rot_x_180 @ tf_world_to_camera
# pybullet expects the view matrix flattened in column-major order
view_matrix = tf_mat.flatten(order="F")

where tf_world_to_camera is a homogeneous (4×4) world-to-camera transformation matrix.