Robust visual inertial odometry for agile flight sensor fusion
Permanent link
https://hdl.handle.net/10037/34186
Date
2024-07-05
Type
Master thesis
Author
Ferkic, Edvin
Abstract
This thesis presents the development and validation of a robust Visual-Inertial Odometry (VIO) system tailored for agile unmanned aerial vehicles (UAVs) operating in GPS-denied environments. Such environments, including urban canyons, indoor spaces, and densely forested areas, challenge traditional navigation systems that depend on satellite positioning. The study introduces a novel visual odometry (VO) algorithm designed to handle variations in lighting and motion blur efficiently, which is essential for maintaining high accuracy under dynamic conditions. Visual data are fused with inertial measurements through simple yet effective techniques to improve state-estimation precision, particularly during the aggressive flight maneuvers typical of drone racing and search-and-rescue missions. The system achieves real-time processing and demonstrates substantial robustness on standard test datasets, while deliberately forgoing more complex filtering approaches such as the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF) because of their integration complexity and computational demands. The thesis provides practical implementation insights, establishes a performance baseline for monocular VIO systems, and highlights the trade-offs involved in designing VIO systems for real-world applications. Future work will focus on incorporating advanced filtering techniques, integrating mature frameworks such as OpenVINS, and strengthening the sensor fusion to further improve performance in challenging operational environments.
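
The abstract notes that visual and inertial data are combined through simple fusion techniques rather than an EKF or UKF, but does not specify the scheme. The Python sketch below shows one way such a lightweight, fixed-gain (complementary-filter style) fusion step could look; the function names, sensor rates, gain value, and synthetic measurements are illustrative assumptions and are not taken from the thesis.

# A minimal sketch of a fixed-gain ("complementary filter") visual-inertial fusion step.
# All names, rates, and values here are illustrative assumptions, not the thesis's implementation.
import numpy as np

def propagate_imu(position, velocity, accel_body, R_wb, gravity, dt):
    """Dead-reckon position and velocity from one body-frame accelerometer sample."""
    accel_world = R_wb @ accel_body + gravity            # rotate to world frame, add gravity vector
    velocity = velocity + accel_world * dt
    position = position + velocity * dt + 0.5 * accel_world * dt**2
    return position, velocity

def fuse_with_vo(position_pred, position_vo, gain=0.2):
    """Blend the IMU prediction with a visual-odometry position using a fixed gain."""
    return (1.0 - gain) * position_pred + gain * position_vo

# Example: one fusion cycle at assumed rates of 200 Hz IMU and 20 Hz camera.
p = np.zeros(3)
v = np.zeros(3)
g = np.array([0.0, 0.0, -9.81])
R_wb = np.eye(3)                                          # assume level attitude for the sketch
for _ in range(10):                                       # ten IMU samples between camera frames
    accel_body = np.array([0.1, 0.0, 9.81])               # synthetic accelerometer reading
    p, v = propagate_imu(p, v, accel_body, R_wb, g, dt=0.005)
p = fuse_with_vo(p, position_vo=np.array([0.0025, 0.0, 0.0]))
print(p)

In a filter-free design like the one the abstract describes, the fixed gain trades optimality for predictable, low-cost computation, which is consistent with the thesis's stated emphasis on real-time performance over EKF/UKF complexity.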
Publisher
UiT The Arctic University of Norway
Copyright 2024 The Author(s)