
Vision-based Localization and Mapping with
Heterogeneous Teams of
Ground and Micro Flying Robots
Davide Scaramuzza
Robotics and Perception Group
University of Zurich
http://rpg.ifi.uzh.ch
All videos in this presentation can be found at:
YouTube Channel: https://www.youtube.com/ailabRPG/videos
My Research Group
rpg.ifi.uzh.ch
Current Research: Computer Vision applied to
Autonomous Navigation of Micro Flying Robots
 Air-ground collaboration
 Vision-based Navigation of Flying Robots
 Event-based Vision
 Vision-based Control for aggressive flight
Outline
 Autonomous GPS-denied Navigation
 Multi-robot systems
  - Homogeneous systems (air-air)
  - Heterogeneous systems (air-ground)
Autonomous,
Vision-based Navigation in
GPS-denied Environments
Why not GPS?
 Does not work indoors
 Even outdoors it is not a reliable service
  - Satellite coverage
  - Multipath problem
Visual Odometry
How does it work?
What are good features to track?
[Figure: two consecutive images related by the camera motion (R, T), estimated from tracked features]
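As a concrete illustration (a generic feature-based two-view sketch, not the SVO pipeline described next): good features to track are strong, well-distributed corners, and the relative motion R, T follows from the essential matrix. The intrinsics K, the image files and the thresholds below are illustrative placeholders.

```python
# Minimal two-view visual-odometry sketch: track Shi-Tomasi corners with KLT,
# then recover the relative rotation R and translation direction T from the
# essential matrix. 'img1.png', 'img2.png' and K are placeholders.
import cv2
import numpy as np

K = np.array([[450.0, 0, 320], [0, 450.0, 240], [0, 0, 1]])  # assumed intrinsics

img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)

# Good features to track: strong, well-distributed corners.
p1 = cv2.goodFeaturesToTrack(img1, maxCorners=500, qualityLevel=0.01, minDistance=10)
p2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, p1, None)
p1, p2 = p1[status.ravel() == 1], p2[status.ravel() == 1]

# Essential matrix with RANSAC, then decompose into R, T (T only up to scale).
E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
_, R, T, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
print("R =\n", R, "\nT (unit direction) =", T.ravel())
```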
Keyframe-based Visual Odometry
[Figure: keyframes 1 and 2 triangulate an initial pointcloud; the current frame is tracked against it; when a new keyframe is selected, new points are triangulated]
[Forster, Pizzoli, Scaramuzza, «SVO: Semi Direct Visual Odometry», ICRA’14, RSS’15]
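A hedged sketch of the triangulation step above: given two keyframes with known poses, matched pixel observations are triangulated into new map points. The poses, intrinsics and matches are placeholders, and OpenCV's linear triangulation stands in for the actual SVO implementation.

```python
# Triangulate new map points from two keyframes with known poses.
import cv2
import numpy as np

K = np.array([[450.0, 0, 320], [0, 450.0, 240], [0, 0, 1]])  # assumed intrinsics

def triangulate(K, R1, t1, R2, t2, pts1, pts2):
    """3D points from two keyframes with poses (R, t) and matched pixels (N,2)."""
    P1 = K @ np.hstack([R1, t1.reshape(3, 1)])      # 3x4 projection matrices
    P2 = K @ np.hstack([R2, t2.reshape(3, 1)])
    X_h = cv2.triangulatePoints(P1, P2, pts1.T.astype(float), pts2.T.astype(float))
    return (X_h[:3] / X_h[3]).T                     # de-homogenize -> (N, 3)
```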
Visual-Inertial Fusion
 Fusion is solved as a non-linear optimization problem (no Kalman filter)
 Increased accuracy over filtering methods
 The cost combines IMU residuals and reprojection residuals
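Schematically, the estimate is the minimizer of a sum of squared IMU and reprojection residuals; the symbols below are generic placeholders rather than the paper's exact notation:

```latex
\mathbf{X}^{*} \;=\; \arg\min_{\mathbf{X}} \;
  \sum_{k} \left\| \mathbf{r}_{\mathrm{IMU}}\!\left(\mathbf{x}_{k}, \mathbf{x}_{k+1}\right) \right\|^{2}_{\Sigma_{\mathrm{IMU}}}
  \;+\;
  \sum_{k} \sum_{i \in \mathcal{L}_{k}} \left\| \mathbf{r}_{\mathrm{cam}}\!\left(\mathbf{x}_{k}, \mathbf{l}_{i}\right) \right\|^{2}_{\Sigma_{\mathrm{cam}}}
```

where x_k are the robot states, l_i the landmarks observed in frame k, and the Σ's the measurement covariances.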
[Forster, Carlone, Dellaert, Scaramuzza, IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation, RSS'15]
Visual Odometry
Accuracy: 0.1% of the travel distance
[Forster, Carlone, Dellaert, Scaramuzza, IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation, RSS'15]
Smartphone computer for image analysis
 Quad-core Odroid (ARM Cortex-A9), as used in Samsung Galaxy S4 phones
 Inertial sensors
 HD camera: global shutter, HDR, 90 fps
 Total weight: 170 grams!
Autonomous Vision-based Flight (no GPS)
[Forster, Pizzoli, Scaramuzza, «SVO: Semi Direct Visual Odometry», ICRA’14]
Bayesian Estimation of the Depth Uncertainty
• Initialize a depth filter for every new feature
• Recursive Bayesian depth estimation
• The measurement likelihood models outliers:
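The likelihood follows the depth-filter of Vogiatzis & Hernández (2011), which SVO builds on: each new depth measurement is either a good measurement around the true depth d (with variance τ²) or an outlier drawn uniformly over the valid depth range, with inlier probability ρ. The notation here is generic, not necessarily the slide's:

```latex
p\left(\tilde{d}_k \mid d, \rho\right)
  \;=\; \rho \,\mathcal{N}\!\left(\tilde{d}_k \mid d, \tau_k^{2}\right)
  \;+\; (1-\rho)\,\mathcal{U}\!\left(\tilde{d}_k \mid d_{\min}, d_{\max}\right)
```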
[Forster, Pizzoli, Scaramuzza, «SVO: Semi Direct Visual Odometry», ICRA'14]
Localization and Mapping with
Swarms of Micro Aerial Vehicles
[C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IROS'13]
Motivation
 Map an unknown environment faster
 Know the relative position between robots
 Robust against single-robot failure
This problem is known in robotics as
Multi-Robot Visual SLAM (Simultaneous
Localization and Mapping)
Comparison with State-of-the-Art
 Co-SLAM [Zou and Tan, PAMI'12]: synchronized, known initial positions, fully centralized
 Our proposed solution [IROS'13]: asynchronous, unknown initial positions, distributed pre-processing
[Diagram: Co-SLAM feeds all camera streams (Cam 1, Cam 2) directly into a single map]
System Overview
[Diagram: VO 1 and VO 2 run onboard the MAVs; the ground station runs a frame handler and a local map per MAV, plus a place recognizer]
Distributed processing:
 Each MAV runs an onboard visual odometry and streams point features and relative poses (1 Mbit/s instead of 90 Mbit/s for full frames at 30 Hz)
 The ground station computes local maps for each MAV, detects overlaps, and merges the different maps into a global map
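A quick back-of-the-envelope check of the bandwidth figures; the frame format is an assumption (752x480, 8-bit grayscale, a common MAV camera setting), not stated on the slide:

```python
# Rough bandwidth estimate: streaming raw frames vs. sparse features + poses.
# Resolution and bit depth are assumed values for illustration only.
width, height, bits_per_px, fps = 752, 480, 8, 30
raw_mbit_s = width * height * bits_per_px * fps / 1e6   # ~86.6 Mbit/s
print(f"raw video: {raw_mbit_s:.1f} Mbit/s vs. ~1 Mbit/s for features + poses")
```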
Mapping on the Groundstation
[Diagram: the MAV streams point features and relative poses (R, t) to the groundstation]
 Use the motion estimate (R, t) from Visual Odometry as a prior
 Refine the pose w.r.t. the map with Bundle Adjustment (R_BA, t_BA), using g2o [Kümmerle et al., ICRA'11]
[C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IROS'13]
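A minimal sketch of the refinement idea: starting from the VO prior, the pose is re-optimized against the already-mapped 3D points by minimizing reprojection error with a robust loss. The actual system uses g2o and also refines the structure; the motion-only simplification, the function names and SciPy are stand-ins here.

```python
# Hypothetical motion-only refinement of a pose against a fixed local map.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(pose6, pts_w, K):
    """Project world points with pose = [rotvec(3), t(3)], world -> camera."""
    R = Rotation.from_rotvec(pose6[:3]).as_matrix()
    pc = pts_w @ R.T + pose6[3:]                 # points in the camera frame
    uv = pc[:, :2] / pc[:, 2:3]                  # normalized image coordinates
    return uv * [K[0, 0], K[1, 1]] + K[:2, 2]    # apply focal lengths and centre

def refine_pose(pose_prior, pts_w, uv_obs, K):
    """Refine the 6-DoF pose w.r.t. the map with robust (Huber) least squares."""
    residual = lambda p: (project(p, pts_w, K) - uv_obs).ravel()
    return least_squares(residual, pose_prior, loss="huber", f_scale=1.0).x
```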
Place Recognition
[Diagram: the place recognizer on the ground station operates on the local maps built from VO 1 and VO 2 on the MAVs]
1. Appearance-based Detection
    Bag-of-Words image retrieval [Sivic et al., 2005]
2. Geometric Verification
    3-point RANSAC for point-cloud alignment, using the 3-point algorithm [Kneip & Scaramuzza, CVPR'11]
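A hedged sketch of the geometric-verification step: given 2D-3D matches between the query keyframe and a candidate map, a minimal perspective-three-point solver runs inside RANSAC and the loop-closure candidate is accepted only if enough inliers survive. The intrinsics, thresholds and matches are illustrative placeholders, and OpenCV's P3P solver stands in for the cited one.

```python
# Verify a place-recognition candidate with P3P + RANSAC on 2D-3D matches.
import cv2
import numpy as np

def verify_candidate(pts3d, pts2d, K, min_inliers=30):
    """pts3d: (N,3) map points, pts2d: (N,2) keypoints in the query image."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d.astype(np.float64), pts2d.astype(np.float64), K, None,
        flags=cv2.SOLVEPNP_P3P, reprojectionError=3.0, iterationsCount=300)
    if not ok or inliers is None or len(inliers) < min_inliers:
        return None                                # reject the place match
    R, _ = cv2.Rodrigues(rvec)                     # relative pose map -> camera
    return R, tvec.ravel(), len(inliers)
```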
Loop-Closure (single robot)
Indoor flight
7 DoF pose-graph optimization based on [Strasdat et al., RSS 2010]
[C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IROS'13]
Map Merging (multiple robots)
Indoor flight
[C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IROS'13]
Map Merging (multiple robots)
Outdoor flight
[C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IROS'13]
Multi-robot – Dense Reconstruction
[Scaramuzza, Achtelik, Weiss, Fraundorfer, et al., Vision-Controlled Micro Flying Robots: from System
Design to Autonomous Navigation and Mapping in GPS-denied Environments, IEEE RAM, 2014]
Air-Ground Collaboration
Robots for Search and Rescue?
 Search and rescue missions can benefit from robotic technologies (Fukushima,
Gotthard rock slide, Italy earthquake)
 Most robots move on the ground and are teleoperated by trained professionals
2011 - Fukushima Nuclear Power Plant
Collaboration between Ground and Aerial Robots
Air-ground exploration
Air-ground collaborative grasping
[C. Forster, M. Pizzoli, D. Scaramuzza, Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction, IROS'13]
[Mueggler, Faessler, Fontana, Scaramuzza, Aerial-Guided Navigation of a Ground Robot among Movable Obstacles, SSRR'14]
Air-Ground Localization
How can we achieve Mutual Localization?
 Two distinct approaches:
   Relative observations
    - Ground to air
    - Air to ground
   Common scene observation
Air-Ground Robot Localization based on Infrared LEDs
[Setup: 5 infrared LEDs on one robot, observed by a camera with an infrared filter on the other]
Air-Ground Robot Localization based on Infrared LEDs
[Faessler, Mueggler, Schwabe, Scaramuzza, A Monocular Pose Estimation System based on Infrared LEDs, ICRA'14]
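A hedged sketch of the relative-localization idea: with the known 3D layout of the infrared LEDs on the target robot and their detected 2D positions in the image, the full 6-DoF relative pose follows from a PnP solve (the real system also handles the LED-to-detection correspondence problem). The LED layout, intrinsics and detections below are illustrative placeholders, not the paper's values.

```python
# Relative pose of an LED-equipped robot from one image via PnP.
import cv2
import numpy as np

leds_3d = np.array([[ 0.10,  0.10, 0.0],   # assumed LED layout in the target
                    [-0.10,  0.10, 0.0],   # robot's body frame (metres)
                    [-0.10, -0.10, 0.0],
                    [ 0.10, -0.10, 0.0],
                    [ 0.00,  0.00, 0.05]])
leds_2d = np.array([[402., 311.], [258., 309.], [260., 178.],
                    [398., 180.], [330., 243.]])          # detected blob centres
K = np.array([[450., 0., 320.], [0., 450., 240.], [0., 0., 1.]])

ok, rvec, tvec = cv2.solvePnP(leds_3d, leds_2d, K, None,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)        # rotation target-robot frame -> camera frame
print("relative position of the target robot:", tvec.ravel())
```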
Air-Ground Robot Localization based on Common Scene Observation
[Setup: camera, laser]
[Forster, Pizzoli, Scaramuzza, Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction, IROS'13]
Air-ground Collaborative Mapping
[Forster, Pizzoli, Scaramuzza, Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction, IROS'13]
Challenge
 Radically different view-points!
 Different sensors
Standard Feature Matching
Appearance-based relative localization with feature matching fails!
[Figure: with Affine SIFT [Morel, SIAM'09] some matches in locally planar scenes; otherwise no matches]
Our Solution:
First build dense maps with each robot and then align the maps
Ground Robot Mapping with Kinect
 Trajectory of the ground-robot
 Dense point-cloud
[Forster et al., «Semi-direct Visual Odometry», ICRA'14]
Aerial Mapping with Single Camera
Red points: ground-robot map
[Forster et al., «Semi-direct Visual Odometry», ICRA'14]
[M. Pizzoli, C. Forster, D. Scaramuzza, REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time, ICRA'14]
Initial Guess from Heightmap Correlation
[Figure: aerial height map, ground height map, and their correlation]
Zero Mean Sum of Squared Differences (ZMSSD) cost:
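In the usual ZMSSD form, with H_a the aerial height map, H_g the ground height map, and Ω their overlap at a candidate shift (u, v) (generic notation, not necessarily the slide's), the cost is:

```latex
\mathrm{ZMSSD}(u,v) \;=\; \sum_{(x,y)\,\in\,\Omega}
  \Big[ \big( H_a(x+u,\,y+v) - \bar{H}_a \big) - \big( H_g(x,y) - \bar{H}_g \big) \Big]^{2}
```

where the bars denote the means over Ω; the initial guess for the map alignment is the shift that minimizes this cost.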
[Forster et al., «Semi-direct Visual Odometry», ICRA'14]
Map Augmentation
Air-Ground
Collaborative Grasping
Aerial Guided Navigation among Movable Obstacles
Winner of KUKA Innovation Award at AUTOMATICA
[Figure annotations: target location, optimal path, obstacle to remove]
Aerial-Guided Navigation among Movable Obstacles
Winner of KUKA Innovation Award at AUTOMATICA
[Mueggler, Faessler, Fontana, Scaramuzza, Aerial-Guided Navigation of a Ground Robot among Movable Obstacles, SSRR'14]
http://rpg.ifi.uzh.ch
All videos in this presentation can be found at:
https://www.youtube.com/ailabRPG/videos
Software and datasets:
http://rpg.ifi.uzh.ch/software_datasets.html
Thanks! Questions?
Software: http://rpg.ifi.uzh.ch/software_datasets.html
Funding