Date of Graduation

2018

Document Type

Thesis

Degree Type

MS

College

Statler College of Engineering and Mineral Resources

Department

Mechanical and Aerospace Engineering

Committee Chair

James Gross

Committee Co-Chair

Yu Gu

Committee Member

John A. Christian

Abstract

Autonomous systems have the potential to increase safety and productivity in a wide array of applications. Practical application of autonomous systems requires an accurate navigation solution, which can only be achieved with sufficient mathematical knowledge of the underlying sensor systems. The formulation and estimation of these sensor models is known as calibration. The following work surveys three classes of optical sensors used in autonomous navigation. Mathematical models are formulated for perspective cameras, omnidirectional cameras, and a combined camera and light detection and ranging (LiDAR) sensor system. Descriptions are provided for perspective camera calibration methods (namely calibration via checkerboard images, goniometric calibration, and calibration via star fields), omnidirectional camera calibration via checkerboard images, and calibration of a LiDAR-fisheye camera system via a planar fiducial array. These calibration methods were then verified experimentally.
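The perspective (pinhole) camera model the abstract refers to maps 3D points in the camera frame to pixel coordinates through an intrinsic matrix; calibration estimates that matrix. A minimal sketch of the forward model in Python, where the intrinsic values (focal lengths, principal point) and the test point are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def project_points(K, points_3d):
    """Project 3D camera-frame points to pixel coordinates
    using the pinhole (perspective) camera model."""
    pts = np.asarray(points_3d, dtype=float)
    uvw = (K @ pts.T).T               # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]   # perspective divide by depth

# Illustrative intrinsics: focal lengths fx, fy and principal point (cx, cy)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# A point 2 m in front of the camera, 0.5 m right and 0.25 m up
pixels = project_points(K, [[0.5, -0.25, 2.0]])
print(pixels)  # -> [[520. 140.]]
```

Checkerboard calibration, as described above, inverts this model: given many known 3D corner positions and their observed pixel locations, it solves for the entries of K (and lens distortion terms) that best explain the observations.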
