Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers: A Case Study

Geovanni Martínez
Image Processing and Computer Vision Lab. (IPCV-LAB)
School of Electrical Engineering, University of Costa Rica

March 10, 2016


Page 1: Intensity-Difference Based Monocular Visual Odometry for

Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers: A Case Study

Geovanni Martínez

Image Processing and Computer Vision Lab. (IPCV-LAB) School of Electrical Engineering, University of Costa Rica

March 10, 2016

Page 2: Intensity-Difference Based Monocular Visual Odometry for

2

Overview

•  Introduction •  Problem •  Current approach •  Our approach •  Results •  Summary

Page 3: Intensity-Difference Based Monocular Visual Odometry for

3

Introduction

•  These rovers have proven to be very powerful and long-lasting tools for Mars exploration due to their ability to navigate and perform activities autonomously

NASA JPL six-wheel rocker-bogie mobile robots for Mars surface exploration

Courtesy NASA/JPL-Caltech

Page 4: Intensity-Difference Based Monocular Visual Odometry for

4

Introduction

•  For safe and precise autonomous navigation, the rover must know its exact position and orientation at any time

Page 5: Intensity-Difference Based Monocular Visual Odometry for

5

Introduction

•  The rover's position is obtained by integrating its translation ΔT over time

•  ΔT is estimated from encoder readings of how much the wheels have turned (wheel odometry)

Page 6: Intensity-Difference Based Monocular Visual Odometry for

6

Problem

•  Wheel odometry fails in slippery environments, such as steep slopes and sandy terrain, because the wheels slip due to the loss of traction

Courtesy NASA/JPL-Caltech

Page 7: Intensity-Difference Based Monocular Visual Odometry for

7

Problem

•  This could cause the rover to deviate from its desired path, which in turn could cause the loss of an entire day of scientific activity, trap the vehicle in hazardous terrain, or even damage the hardware

Page 8: Intensity-Difference Based Monocular Visual Odometry for

8

Current Approach

Slip detection and compensation

•  Estimate the rover's motion B̂ from the two video signals delivered by a stereo video camera mounted on the rover by applying a feature-based stereo visual odometry algorithm

•  Compute the rover's position by accumulating the motion estimates B̂ over time (see the accumulation sketch below)

•  Detect and compensate for any slip that may occur by using the computed rover's position
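To illustrate the accumulation step in the list above, here is a minimal Python/NumPy sketch (the authors' system is implemented in C; the axis-angle convention, function names, and frame handling here are assumptions made only for illustration):

```python
import numpy as np

def rotation_matrix(delta_omega):
    """Rotation matrix from a small axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(delta_omega)
    if theta < 1e-12:
        return np.eye(3)
    k = delta_omega / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def accumulate(motions):
    """Integrate per-frame motion estimates (delta_T, delta_Omega) into a global pose."""
    R, t = np.eye(3), np.zeros(3)
    trajectory = [t.copy()]
    for delta_T, delta_Omega in motions:
        # Express the frame-to-frame translation in the global frame and accumulate it.
        t = t + R @ np.asarray(delta_T)
        R = R @ rotation_matrix(np.asarray(delta_Omega))
        trajectory.append(t.copy())
    return np.array(trajectory)
```

Each ΔT is assumed to be expressed in the rover frame of the previous time step, which is why it is rotated into the global frame before being added.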

Page 9: Intensity-Difference Based Monocular Visual Odometry for

9

Stereo Visual Odometry

Algorithm

Page 10: Intensity-Difference Based Monocular Visual Odometry for

10

Stereo Visual Odometry

1.  Capture a first stereo image pair before the rover’s motion

Page 11: Intensity-Difference Based Monocular Visual Odometry for

11

Stereo Visual Odometry

2.  Estimate the 3D positions of a first set of feature points from the first stereo image pair by using stereo triangulation

First set of 3D feature point positions
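As an aside, a minimal Python/NumPy sketch of the stereo triangulation in step 2 for a rectified stereo pair (the focal length, baseline, and principal point below are hypothetical placeholders, not the rover camera's calibration):

```python
import numpy as np

def triangulate_rectified(uv_left, disparity, f=600.0, baseline=0.12, cx=320.0, cy=240.0):
    """3D positions of feature points from a rectified stereo pair.
    uv_left: (N, 2) pixel coordinates in the left image.
    disparity: (N,) horizontal disparity in pixels (left minus right)."""
    uv_left = np.asarray(uv_left, dtype=float)
    d = np.asarray(disparity, dtype=float)
    Z = f * baseline / d                      # depth from disparity
    X = (uv_left[:, 0] - cx) * Z / f          # lateral coordinate
    Y = (uv_left[:, 1] - cy) * Z / f          # vertical coordinate
    return np.stack([X, Y, Z], axis=1)
```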

Page 12: Intensity-Difference Based Monocular Visual Odometry for

12

Stereo Visual Odometry

3.  Allow the rover to move

Page 13: Intensity-Difference Based Monocular Visual Odometry for

13

Stereo Visual Odometry

3.  Allow the rover to move

Page 14: Intensity-Difference Based Monocular Visual Odometry for

14

Stereo Visual Odometry

3.  Allow the rover to move

Page 15: Intensity-Difference Based Monocular Visual Odometry for

15

Stereo Visual Odometry

3.  Allow the rover to move

Page 16: Intensity-Difference Based Monocular Visual Odometry for

16

Stereo Visual Odometry

3.  Allow the rover to move

Page 17: Intensity-Difference Based Monocular Visual Odometry for

17

Stereo Visual Odometry

3.  Allow the rover to move

6 motion parameters: B = (ΔT, ΔΩ)^T, where ΔT is the translation and ΔΩ the rotation

Page 18: Intensity-Difference Based Monocular Visual Odometry for

18

Stereo Visual Odometry

4.  Capture a second stereo image pair after the rover's motion

Page 19: Intensity-Difference Based Monocular Visual Odometry for

19

Stereo Visual Odometry

5.  Estimate the 3D positions of a second set of feature points from the second stereo image pair by using stereo triangulation

Second set of 3D feature point positions

Page 20: Intensity-Difference Based Monocular Visual Odometry for

20

Stereo Visual Odometry

6.  Establish the 3D correspondences (3D offsets) between the two sets of 3D feature point positions before and after the rover’s motion

3D offsets

Page 21: Intensity-Difference Based Monocular Visual Odometry for

21

Stereo Visual Odometry

7.  Search for those motion parameters B' that rigidly move the first set of 3D feature point positions (before the rover's motion) to the place where the 3D offsets become as small as possible

Page 22: Intensity-Difference Based Monocular Visual Odometry for

22

Stereo Visual Odometry

7.  Search for those motion parameters B' that rigidly move the first set of 3D feature point positions (before the rover's motion) to the place where the 3D offsets become as small as possible

ΔΩ’

Page 23: Intensity-Difference Based Monocular Visual Odometry for

23

Stereo Visual Odometry

7.  Search for those motion parameters B' that rigidly move the first set of 3D feature point positions (before the rover's motion) to the place where the 3D offsets become as small as possible

The parameters B' are found by maximizing a likelihood function of the established 3D correspondences

ΔT’

3D offsets become as small as possible

ΔΩ’

Page 24: Intensity-Difference Based Monocular Visual Odometry for

24

Stereo Visual Odometry

7.  Search for those motion parameters B' that rigidly move the first set of 3D feature point positions (before the rover's motion) to the place where the 3D offsets become as small as possible

The rover's motion estimates are B̂ = -B'

ΔT'

ΔΩ'
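The slides state that B' is found by maximizing a likelihood function of the established 3D correspondences. Purely as an illustration of the underlying geometric fit, here is the standard closed-form least-squares rigid alignment of two corresponding 3D point sets (Kabsch/Horn method via SVD), which is not necessarily the estimator used by the authors:

```python
import numpy as np

def fit_rigid_motion(P_before, P_after):
    """Least-squares rigid motion (R, t) minimizing ||R @ p_i + t - q_i||^2 over the
    established 3D correspondences (row i of P_before corresponds to row i of P_after)."""
    P, Q = np.asarray(P_before, float), np.asarray(P_after, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])      # guard against a reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

The returned (R, t) plays the role of B', moving the first point set onto the second; per the slide, the rover's motion estimate is then the inverse, B̂ = -B'.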

Page 25: Intensity-Difference Based Monocular Visual Odometry for

25

Our Approach

Slip detection and compensation

•  Estimate the rover's motion B̂ from the video signal delivered by a single camera mounted on the rover by applying an intensity-difference based monocular visual odometry algorithm

•  Compute the rover's position by accumulating the motion estimates B̂ over time

•  Detect and compensate for any slip that may occur by using the computed rover's position

Page 26: Intensity-Difference Based Monocular Visual Odometry for

26

Monocular Visual Odometry

Algorithm

Page 27: Intensity-Difference Based Monocular Visual Odometry for

27

Monocular Visual Odometry

1.  Capture a first intensity image before the rover’s motion

Page 28: Intensity-Difference Based Monocular Visual Odometry for

28

Monocular Visual Odometry

2.  Adapt the size, position and orientation of a generic surface model to the content of the first intensity image

In this contribution, the generic surface model is a rigid, flat triangle mesh consisting of only two triangles

Adapted surface model
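The slides do not detail how the model is adapted to the image content. Purely as an illustration of what a flat two-triangle surface model can look like, here is a Python/NumPy sketch that instantiates one by intersecting the image-corner viewing rays with an assumed ground plane (the intrinsics, camera height, and tilt used here are hypothetical placeholders):

```python
import numpy as np

def plane_mesh_from_image_corners(K, n, d, width=640, height=480):
    """Flat two-triangle surface model obtained by intersecting the viewing rays through
    the image corners with the plane n . X = d (camera coordinates, x right, y down,
    z forward); K is the 3x3 camera matrix, n the plane's unit normal, d its offset."""
    corners = np.array([[0, 0], [width - 1, 0],
                        [width - 1, height - 1], [0, height - 1]], dtype=float)
    K_inv = np.linalg.inv(K)
    vertices = []
    for u, v in corners:
        ray = K_inv @ np.array([u, v, 1.0])       # viewing ray direction of the corner pixel
        vertices.append(ray * (d / (n @ ray)))    # ray-plane intersection
    triangles = np.array([[0, 1, 2], [0, 2, 3]])  # the two triangles of the mesh
    return np.array(vertices), triangles

# Hypothetical example: camera 0.77 m above flat ground, tilted 37 degrees downwards.
K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
tilt = np.deg2rad(37.0)
normal = np.array([0.0, np.cos(tilt), np.sin(tilt)])  # ground-plane normal in camera coordinates
vertices, triangles = plane_mesh_from_image_corners(K, normal, d=0.77)
```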

Page 29: Intensity-Difference Based Monocular Visual Odometry for

29

Monocular Visual Odometry

3.  Select as observation points those image points in the first intensity image with high linear intensity gradients

Page 30: Intensity-Difference Based Monocular Visual Odometry for

30

Monocular Visual Odometry

3.  Select as observation points those image points in the first intensity image with high linear intensity gradients and attach them (together with their intensity values) rigidly to the surface model

Rigidly attached observation points
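A minimal Python/NumPy sketch of the selection in step 3 (the gradient operator and the threshold value are assumptions; the authors' C implementation may select points differently):

```python
import numpy as np

def select_observation_points(image, grad_threshold=20.0):
    """Select image points with high linear intensity gradients as observation points
    and keep their intensity values, so they can be attached to the surface model."""
    I = np.asarray(image, dtype=float)
    gy, gx = np.gradient(I)                      # simple finite-difference gradients
    grad_mag = np.hypot(gx, gy)
    vs, us = np.nonzero(grad_mag > grad_threshold)
    points = np.stack([us, vs], axis=1)          # (u, v) pixel coordinates
    intensities = I[vs, us]                      # intensity values stored with each point
    return points, intensities
```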

Page 31: Intensity-Difference Based Monocular Visual Odometry for

31

Monocular Visual Odometry

4.  Allow the rover to move

Page 32: Intensity-Difference Based Monocular Visual Odometry for

32

Monocular Visual Odometry

4.  Allow the rover to move

Page 33: Intensity-Difference Based Monocular Visual Odometry for

33

Monocular Visual Odometry

4.  Allow the rover to move

Page 34: Intensity-Difference Based Monocular Visual Odometry for

34

Monocular Visual Odometry

4.  Allow the rover to move

Page 35: Intensity-Difference Based Monocular Visual Odometry for

35

Monocular Visual Odometry

4.  Allow the rover to move

6 motion parameters: B = (ΔT, ΔΩ)^T, where ΔT is the translation and ΔΩ the rotation

Page 36: Intensity-Difference Based Monocular Visual Odometry for

36

Monocular Visual Odometry

5.  Capture a second intensity image after the rover's motion

Page 37: Intensity-Difference Based Monocular Visual Odometry for

37

Monocular Visual Odometry

6.  Project the observation points into the image plane and compute the intensity differences between their intensity values and the second intensity image

Intensity differences at the observation points
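A minimal Python/NumPy sketch of step 6, using a pinhole projection and bilinear sampling (the camera matrix K and the assumption that all projections fall inside the image are simplifications for illustration):

```python
import numpy as np

def project(points_3d, K):
    """Pinhole projection of 3D observation points (camera coordinates) into the image plane."""
    P = np.asarray(points_3d, float)
    uvw = (K @ P.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def bilinear_sample(image, uv):
    """Bilinearly interpolated intensities of `image` at subpixel positions uv = (u, v).
    Assumes the positions fall inside the image."""
    I = np.asarray(image, float)
    u, v = uv[:, 0], uv[:, 1]
    u0, v0 = np.floor(u).astype(int), np.floor(v).astype(int)
    du, dv = u - u0, v - v0
    u0 = np.clip(u0, 0, I.shape[1] - 2)
    v0 = np.clip(v0, 0, I.shape[0] - 2)
    return ((1 - du) * (1 - dv) * I[v0, u0] + du * (1 - dv) * I[v0, u0 + 1]
            + (1 - du) * dv * I[v0 + 1, u0] + du * dv * I[v0 + 1, u0 + 1])

def intensity_differences(points_3d, stored_intensities, second_image, K):
    """Differences between the intensities stored with the observation points and the
    intensities found at their projections in the second intensity image."""
    uv = project(points_3d, K)
    return stored_intensities - bilinear_sample(second_image, uv)
```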

Page 38: Intensity-Difference Based Monocular Visual Odometry for

38

Monocular Visual Odometry

7.  Search for those parameters B' that move the surface model (and therefore the rigidly attached observation points) to the place where the intensity differences become as small as possible

Page 39: Intensity-Difference Based Monocular Visual Odometry for

39

Monocular Visual Odometry

7.  Search for those parameters B' that move the surface model (and therefore the rigidly attached observation points) to the place where the intensity differences become as small as possible

ΔΩ’

Page 40: Intensity-Difference Based Monocular Visual Odometry for

40

Monocular Visual Odometry

7.  Search for those parameters B' that move the surface model (and therefore the rigidly attached observation points) to the place where the intensity differences become as small as possible

The parameters B' are found by maximizing a likelihood function of the intensity differences at the observation points

ΔT’

Intensity differences at the observation points become as small as possible

ΔΩ’

Page 41: Intensity-Difference Based Monocular Visual Odometry for

41

Monocular Visual Odometry

7.  Search for those parameters B' that move the surface model (and therefore the rigidly attached observation points) to the place where the intensity differences become as small as possible

The rover's motion estimates are B̂ = -B'

ΔT'

ΔΩ'
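According to the slides, B' is obtained by maximizing a likelihood function of the intensity differences. As a generic, hedged stand-in for that search, the sketch below runs a plain Gauss-Newton iteration on the sum of squared intensity differences with a numerical Jacobian; `residual_fn` is a hypothetical callback that moves the surface model by b, projects the observation points, and returns the resulting intensity differences (for instance by combining the earlier sketches):

```python
import numpy as np

def gauss_newton(residual_fn, b0, iterations=10, eps=1e-6):
    """Generic Gauss-Newton minimization of 0.5 * ||r(b)||^2 with a numerical Jacobian.
    residual_fn(b) returns the vector of intensity differences for motion parameters
    b = (dTx, dTy, dTz, dWx, dWy, dWz); b0 is the starting guess (e.g. zeros)."""
    b = np.asarray(b0, dtype=float)
    for _ in range(iterations):
        r = residual_fn(b)
        # Numerical Jacobian of the residuals with respect to the six motion parameters.
        J = np.empty((r.size, b.size))
        for j in range(b.size):
            db = np.zeros_like(b)
            db[j] = eps
            J[:, j] = (residual_fn(b + db) - r) / eps
        # Normal equations; a tiny damping term keeps the step well conditioned.
        step = np.linalg.solve(J.T @ J + 1e-9 * np.eye(b.size), -J.T @ r)
        b = b + step
        if np.linalg.norm(step) < 1e-10:
            break
    return b   # estimated B'; per the slide, the rover's motion estimate is B_hat = -B'
```

Under a zero-mean Gaussian noise model for the intensity differences, maximizing the likelihood and minimizing the sum of squared differences coincide, which is why a least-squares iteration is a reasonable illustration here.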

Page 42: Intensity-Difference Based Monocular Visual Odometry for

42

Monocular Visual Odometry

Innovations

•  A single camera is used instead of a stereo camera
•  Evaluation of intensity differences instead of feature point correspondences
•  Direct estimation method, no intermediate steps required

Page 43: Intensity-Difference Based Monocular Visual Odometry for

43

Experimental Results

•  Implemented in C
•  Tested on a real rover platform (Husky A200)
•  Outdoor sunlit conditions
   –  Even under severe global illumination changes
•  343 experiments
   –  Over flat paver sidewalks
•  Performance measure
   –  Absolute position error as a percentage of the distance traveled (see the sketch below)
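One plausible way to compute the performance measure named above, as a minimal Python/NumPy sketch (the exact definition used in the experiments may differ):

```python
import numpy as np

def absolute_position_error_percent(estimated_path, ground_truth_path):
    """Position error at the end of a run, as a percentage of the distance traveled.
    Both paths are (N, 3) arrays of positions sampled along the trajectory."""
    est = np.asarray(estimated_path, float)
    gt = np.asarray(ground_truth_path, float)
    final_error = np.linalg.norm(est[-1] - gt[-1])
    distance_traveled = np.linalg.norm(np.diff(gt, axis=0), axis=1).sum()
    return 100.0 * final_error / distance_traveled
```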

Page 44: Intensity-Difference Based Monocular Visual Odometry for

44

Experimental Results

•  Paths and velocity
   –  Straight or semicircular paths
   –  3 cm/s constant velocity
•  Video camera
   –  Rigidly attached to the rover
   –  15 fps
   –  77 cm above the ground
   –  Looking to the left side of the rover, tilted downwards 37°
   –  640x480 pixels, 43° FOV

Captured intensity image

Page 45: Intensity-Difference Based Monocular Visual Odometry for

45

Experimental Results

•  Ground truth
   –  Robotic total station
   –  Tracks a prism rigidly attached to the rover
   –  Delivers its 3D position with high precision (<5 mm) every second
•  Comparison of the ground-truth trajectory and the visual odometry trajectory

Page 46: Intensity-Difference Based Monocular Visual Odometry for

46

Experimental Results

Experiment No. 337

Page 47: Intensity-Difference Based Monocular Visual Odometry for

47

Experimental Results

•  The absolute position error was 0.9% of the distance traveled on average
•  The processing time was 0.06 seconds per image on average
•  These results are very similar to those achieved by traditional feature-based stereo visual odometry algorithms, whose errors are within the range of 0.15% to 2.5% of the distance traveled

Page 48: Intensity-Difference Based Monocular Visual Odometry for

48

Future work

•  The outdoor tests will continue
•  The algorithm will be tested over different types of terrain geometries and weather conditions
•  We will use an Adept Mobilerobots Seekur Jr all-terrain rover platform, which is able to operate outdoors in any weather

Page 49: Intensity-Difference Based Monocular Visual Odometry for

49

Summary

•  Monocular visual odometry proposed
•  Alternative algorithm
•  Use of a single camera
•  Evaluation of intensity differences for motion estimation
•  Direct estimation method

•  Until now it has been tested on a real rover platform in outdoor sunlit conditions, under severe global illumination changes, over flat terrain
•  Very encouraging results

Page 50: Intensity-Difference Based Monocular Visual Odometry for

50

Thanks!

Any questions?