Date of Graduation
Statler College of Engineering and Mineral Resources
Mechanical and Aerospace Engineering
John A. Christian
Autonomous systems have the potential to increase safety and productivity in a wide array of applications. Practical deployment of autonomous systems requires an accurate navigation solution, which can only be achieved with sufficient mathematical knowledge of the underlying sensor systems. The formulation and estimation of these sensor models is known as calibration. The following work overviews three classes of optical sensors used in autonomous navigation. Mathematical models are formulated for perspective cameras, omnidirectional cameras, and a combined system consisting of a camera and a light detection and ranging (LiDAR) sensor. Descriptions are provided for perspective camera calibration methods (namely calibration via checkerboard images, goniometric calibration, and calibration via star fields), omnidirectional camera calibration via checkerboard images, and calibration of a LiDAR-fisheye camera system via a planar fiducial array. These calibration methods were then verified experimentally.
Hikes, Jacob J., "Calibration of Cameras and LiDAR for Applications in Autonomous Navigation." (2018). Graduate Theses, Dissertations, and Problem Reports. 8200.