How do I project from camera to lidar coordinates when both sensors share the same coordinate system?


I am working on an object detection task. I am able to detect objects in the KITTI point cloud, and I am trying to use the same code on my own point cloud dataset. In the KITTI dataset the camera and lidar sensor use different coordinate systems; I have attached an image for reference. For the camera the axes are (z, x, y) and for the lidar the axes are (x, y, z).

KITTI sensor setup

For the KITTI dataset, calibration information is also provided, and I am able to understand the KITTI projection matrix. I went through a few materials, including Camera-Lidar Projection.

In the above link, the author calculates the camera-to-lidar projection matrix as:

R_ref2rect_inv = np.linalg.inv(R_ref2rect)    # rectified camera -> reference camera
P_cam_ref2velo = np.linalg.inv(velo2cam_ref)  # reference camera -> velodyne
proj_mat = P_cam_ref2velo @ R_ref2rect_inv    # un-rectify first, then map to the velodyne frame
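To apply proj_mat, the camera point has to be converted to homogeneous coordinates first. A minimal usage sketch (project_cam_to_velo is a hypothetical helper name of mine, not from the linked post):

import numpy as np

def project_cam_to_velo(pts_cam, proj_mat):
    # pts_cam: Nx3 array of points in the rectified camera frame
    pts_h = np.hstack([pts_cam, np.ones((len(pts_cam), 1))])  # Nx4 homogeneous points
    return (proj_mat @ pts_h.T).T[:, :3]                      # Nx3 points in the velodyne frame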

My question is: for my dataset the sensor setup is almost the same, so we can say the lidar and camera share the same coordinate system. In this case, how do I project a point from the camera to the lidar?

In other words, if my object center is (x=7.5, y=1.7, z=0.58) in the camera coordinate system, how do I find the same point in the lidar point cloud?
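If the frames really coincide, I assume the transform is simply the identity and the point stays (7.5, 1.7, 0.58). If only the axis conventions differ, as in KITTI (camera: x right, y down, z forward; lidar: x forward, y left, z up), I assume it reduces to a fixed axis permutation. A minimal sketch under that assumption (please correct me if my sensors' conventions differ):

import numpy as np

# Assumed conventions: camera x right, y down, z forward; lidar x forward, y left, z up.
# Then lidar_x = cam_z, lidar_y = -cam_x, lidar_z = -cam_y.
cam_to_velo = np.array([
    [ 0.,  0., 1.],   # lidar x <-  camera z
    [-1.,  0., 0.],   # lidar y <- -camera x
    [ 0., -1., 0.],   # lidar z <- -camera y
])

p_cam = np.array([7.5, 1.7, 0.58])  # object center in the camera frame
p_velo = cam_to_velo @ p_cam        # -> [0.58, -7.5, -1.7]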
