In recent years, flying robots such as autonomous quadrocopters have attracted increasing interest in robotics and computer vision research.
To navigate safely, these robots need the ability to localize themselves autonomously using their onboard sensors. Potential applications of such systems include the automatic 3D reconstruction of buildings, inspection and simple maintenance tasks, surveillance of public places, and search and rescue.
In this course, we will provide an overview of current techniques for 3D localization, mapping and navigation that are suitable for quadrocopters. It will cover the following topics:
– necessary background on robot hardware, sensors, and 3D transformations
– motion estimation from images (including interest point detection, feature descriptors, robust estimation, visual odometry, iterative closest point)
– filtering techniques and data fusion
– non-linear minimization, bundle adjustment, place recognition, 3D reconstruction
– autonomous navigation, path planning, exploration of unknown environments
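As a small taste of the 3D transformations covered in the background lectures, the following is a minimal pure-Python sketch (not course material) of applying a rigid-body transform p' = Rp + t to a 3D point, using a rotation about the z-axis as the example:

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the z-axis, as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def transform(R, t, p):
    """Apply the rigid-body transform p' = R p + t to a 3D point p."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# Rotate the point (1, 0, 0) by 90 degrees about z, then translate by (1, 0, 0):
q = transform(rot_z(math.pi / 2), [1.0, 0.0, 0.0], [1.0, 0.0, 0.0])
# q is approximately [1.0, 1.0, 0.0]
```

Chaining such rotation/translation pairs is exactly how poses of a camera or robot body frame are composed in localization pipelines.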
The lectures will be accompanied by a lab course in which students implement their own visual navigation system. This course is excellent preparation for a master's thesis project in this area.
2) Linear algebra, geometry, sensors
3) State estimation
4) Robot control
5) Visual motion estimation
6) Structure from Motion
7) RANSAC, ICP, and SLAM
8) Dense reconstruction
9) Guest talks (Jakob Engel, Christian Kerl, Frank Steinbrücker/TUM)
10) Path planning and navigation
11) Guest talks (Friedrich Fraundorfer/ETH and Korbinian Schmid/DLR)
12) Exploration, coverage and benchmarking
Prerequisites: Basic math (linear algebra, probability theory)
Instructor: Jürgen Sturm