
Slide 1

DEPARTMENT OF ENGINEERING SCIENCE

DTAM: Dense Tracking and Mapping in Real-Time
Newcombe, Lovegrove & Davison, ICCV 2011

Amaury Dame
Active Vision Lab, Oxford Robotics Research Group
[email protected]


Slide 2

Introduction

Input:
• Single hand-held RGB camera

Objective:
• Dense mapping
• Dense tracking

[Figure: example input image and the resulting dense 3D map.]


Slide 3

System overview


Slide 4

Depth map estimation

Principle:

• S depth hypotheses are considered for each pixel of the reference image Ir

• Each corresponding 3D point is projected onto a bundle of images Im

• Keep the depth hypothesis that best preserves colour consistency between the reference image and the bundle of images

Formulation:

• (u, d): pixel position and depth hypothesis
• |I(r)|: number of valid reprojections of the pixel into the bundle
• ρ: photometric error between the reference and the current image
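For reference, the averaged photometric cost from the DTAM paper can be written as below; the projection \pi, the intrinsics K and the relative pose T_{mr} follow the paper's notation (where d is actually an inverse depth) rather than text on the slide:

    C_r(u, d) = \frac{1}{|\mathcal{I}(r)|} \sum_{m \in \mathcal{I}(r)} \| \rho_r(I_m, u, d) \|_1,
    \qquad
    \rho_r(I_m, u, d) = I_r(u) - I_m\big( \pi( K \, T_{mr} \, \pi^{-1}(u, d) ) \big)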


Slide 5

Depth map estimation

[Figure: the depth hypotheses for an example reference-image pixel are reprojected into one image of the bundle, and the resulting photometric error is plotted against the hypothesised depth.]
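A minimal sketch of this per-pixel depth sweep, assuming greyscale images, known intrinsics K, a known relative pose T_mr, plain depth (the paper actually parameterises by inverse depth) and nearest-neighbour sampling; none of these details come from the slide itself:

```python
import numpy as np

def photometric_errors(I_r, I_m, K, T_mr, u, depths):
    """L1 photometric error of one reference pixel for a set of depth hypotheses.

    I_r, I_m : greyscale reference / bundle images as float arrays of shape (H, W)
    K        : 3x3 camera intrinsic matrix
    T_mr     : 4x4 rigid transform taking reference-frame points into camera m's frame
    u        : (x, y) integer pixel coordinates in the reference image
    depths   : 1-D array of S depth hypotheses (along the optical axis)
    """
    ray = np.linalg.inv(K) @ np.array([u[0], u[1], 1.0])   # back-projected viewing ray
    errors = np.full(len(depths), np.nan)                  # NaN marks invalid reprojections
    for s, d in enumerate(depths):
        X_r = ray * d                                      # 3-D point in the reference frame
        X_m = T_mr[:3, :3] @ X_r + T_mr[:3, 3]             # same point in camera m's frame
        if X_m[2] <= 0.0:
            continue                                       # behind the camera: no valid reprojection
        x_m = K @ (X_m / X_m[2])                           # pinhole projection into image m
        px, py = int(round(x_m[0])), int(round(x_m[1]))    # nearest-neighbour lookup
        if 0 <= px < I_m.shape[1] and 0 <= py < I_m.shape[0]:
            errors[s] = abs(I_r[u[1], u[0]] - I_m[py, px])
    return errors

# The retained hypothesis is the one with the smallest (bundle-averaged) error, e.g.:
# best_depth = depths[np.nanargmin(photometric_errors(I_r, I_m, K, T_mr, (120, 80), depths))]
```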


Slide 6

Depth map filtering approach

Problem:

• Uniform regions in the reference image do not yield a sufficiently discriminative photometric error

Idea:
• Assume that depth is smooth over uniform regions
• Use a total variation approach in which the depth map is the function being optimised:
  – the photometric error defines the data term
  – the smoothness constraint defines the regularisation term


Slide 7

Depth map filtering approach

Formulation:

• First term: regularisation constraint. The weight g is defined so that it is close to 0 at image gradients and close to 1 in uniform regions, so that gradients in the depth map are penalised mainly over uniform regions.

• Second term: data term defined by the photometric error.
• Huber norm: a differentiable replacement for the L1 norm that preserves discontinuities better than the L2 norm.
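For reference, the regularised energy in the DTAM paper has the form below (ξ is the inverse-depth map, ε the Huber parameter, λ the data-term weight); the exact expression for g is quoted from the paper, not from the slide:

    E(\xi) = \int_\Omega \big\{ g(u)\, \| \nabla \xi(u) \|_\epsilon + \lambda \, C(u, \xi(u)) \big\} \, du,
    \qquad
    g(u) = e^{-\alpha \| \nabla I_r(u) \|_2^{\beta}}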


Slide 8

Total variation optimisation

[Figure: image denoising example from [Pock08] illustrating the regularisation effect of the L2 norm versus the L1 (TV) norm. For three signals f1, f2, f3 containing the same unit step spread over more and more pixels, the quadratic penalty gives QU(f1)=1, QU(f2)=0.1, QU(f3)=0.01, whereas the total variation gives TV(f1)=TV(f2)=TV(f3)=1: TV does not favour smearing an edge out.]
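A quick numerical check of the point made by the figure, assuming f1, f2, f3 are a unit step spread over 1, 10 and 100 samples (this reading of the signals is an assumption):

```python
import numpy as np

def quadratic_penalty(f):
    return float(np.sum(np.diff(f) ** 2))      # sum of squared differences (L2-type penalty)

def total_variation(f):
    return float(np.sum(np.abs(np.diff(f))))   # sum of absolute differences (L1 / TV penalty)

for n in (1, 10, 100):                         # unit step spread over n samples
    f = np.concatenate([np.zeros(5), np.linspace(0.0, 1.0, n + 1), np.ones(5)])
    print(n, quadratic_penalty(f), total_variation(f))
# Up to floating-point rounding this prints QU = 1, 0.1, 0.01 and TV = 1 in all three cases:
# the quadratic penalty prefers smooth ramps, while TV treats a sharp edge the same as a ramp.
```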


Slide 9

Depth map filtering approach

Formulation:

• Problem: optimising this energy directly would require linearising the cost volume, which is expensive, and the cost volume has many local minima.

Approximation:

• Introduce an auxiliary variable a, which can be optimised with a heuristic search.
• The second, coupling term brings the original and auxiliary variables together.
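For reference, the decoupled (auxiliary-variable) energy in the DTAM paper reads as below, with θ the coupling strength driven towards zero during the optimisation; this is quoted from the paper, not from the slide:

    E(\xi, a) = \int_\Omega \big\{ g(u)\, \| \nabla \xi(u) \|_\epsilon
                + \tfrac{1}{2\theta} \big( \xi(u) - a(u) \big)^2
                + \lambda \, C(u, a(u)) \big\} \, du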


Slide 10

Total variation optimisation

Classical approaches:
• Time-marching scheme: steepest-descent method
• Linearisation of the Euler-Lagrange equation

Problem: the optimisation is badly conditioned as |∇ξ| → 0 (uniform regions)

Reformulation of the regularisation with a primal-dual method:
• A dual variable p is introduced to compute the TV norm.
• Indeed, the norm can be rewritten as a maximisation over p, as in the identity below.
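A minimal statement of the identity behind the dual variable, taken from the primal-dual TV literature (e.g. Pock's thesis) rather than from the slide itself; in DTAM the weighted gradient g∇ξ plays the role of v:

    \| v \| = \max_{\| p \| \le 1} \langle p, v \rangle,
    \qquad \text{attained at } p = v / \| v \| \text{ when } v \neq 0,

and for the Huber norm the same construction gives

    \| v \|_\epsilon = \max_{\| p \| \le 1} \big\{ \langle p, v \rangle - \tfrac{\epsilon}{2} \| p \|^2 \big\}.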


Slide 11

Increasing solution accuracy?

Reminder:

Approach:
• Around its minimum Q is well modelled (locally quadratic), so a Newton step on Q is performed to refine the estimate a.
• Equivalent to using ε?

[Figure: result before and after one iteration.]
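A minimal sketch of the sub-sample refinement suggested above, assuming Q(a) is the per-pixel cost evaluated at the discrete depth samples and that the derivatives are taken by central finite differences with sample spacing h (these details are assumptions, not slide text):

    a_{new} = a - \frac{Q'(a)}{Q''(a)},
    \qquad
    Q'(a) \approx \frac{Q(a+h) - Q(a-h)}{2h},
    \qquad
    Q''(a) \approx \frac{Q(a+h) - 2Q(a) + Q(a-h)}{h^2}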


Slide 12

Dense tracking

Inputs:
• 3D textured model of the scene
• Pose at the previous frame

Tracking as a registration problem:
• First, inter-frame rotation estimation: the previous image is aligned with the current image to estimate a coarse inter-frame rotation.
• The estimated pose is used to project the 3D model into a 2.5D image.
• The 2.5D image is registered with the current frame to find the current pose.

Two template matching problems


Slide 13

SSD optimisation

Problem: align the template image T(x) with the input image I(x).

Hypothesis: a coarse approximation p0 of the template position is known.

Formulation: find the transformation that best maps the pixels of the template onto those of the current image by minimising the SSD criterion below; p are the displacement parameters to be optimised.
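The standard SSD criterion from the Lucas-Kanade framework ([Baker IJCV04]); the warp notation W(x; p) is taken from that paper rather than from the slide images:

    \min_{p} \sum_{x} \big[ I\big( W(x; p) \big) - T(x) \big]^2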


Slide 14

SSD optimisation

Problem: minimise the SSD criterion above.

The current estimate of p is iteratively updated to reach the minimum of the function.

Formulations:
• Direct additive

• Direct compositional

• Inverse


Slide 15

SSD optimisation

Example: Direct additive method

• Minimize:

• First order Taylor expansion:

• Solution:

with:
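For completeness, the bullets above correspond to the standard direct (forwards) additive derivation of [Baker IJCV04], reproduced from that paper since the slide formulas are images:

    \text{Minimise: } \sum_x \big[ I( W(x; p + \Delta p) ) - T(x) \big]^2

    \text{First-order Taylor expansion: } \sum_x \big[ I( W(x; p) ) + \nabla I \tfrac{\partial W}{\partial p} \Delta p - T(x) \big]^2

    \text{Solution: } \Delta p = H^{-1} \sum_x \big[ \nabla I \tfrac{\partial W}{\partial p} \big]^T \big[ T(x) - I( W(x; p) ) \big]

    \text{with: } H = \sum_x \big[ \nabla I \tfrac{\partial W}{\partial p} \big]^T \big[ \nabla I \tfrac{\partial W}{\partial p} \big]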


Slide 16

SSD robustified

Problem: in case of occlusion, the occluded pixels shift the optimum of the cost function, so they have to be excluded from the optimisation.

Reminder:

Method:
• Only the pixels whose photometric difference is lower than a threshold are selected.
• The threshold is iteratively tightened, so the selection becomes more restrictive as the optimisation approaches the optimum.
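One way to write the selection described above as a weighted SSD; the binary weights and the shrinking threshold τ_k are an illustrative assumption rather than the authors' exact rule:

    \min_p \sum_x w(x) \big[ I( W(x; p) ) - T(x) \big]^2,
    \qquad
    w(x) = \begin{cases} 1 & \text{if } \big| I( W(x; p) ) - T(x) \big| < \tau_k \\ 0 & \text{otherwise} \end{cases}

with τ_k decreased at each iteration k.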


Slide 17

Template matching

Applications to DTAM:
• First rotation estimation: the template is the previous image, which is matched with the current image. The warp is defined on the space of all rotations, SO(3). The initial estimate of p is the identity.

• Full pose estimation: the template is the 2.5D image and the warp is defined by the full 3D motion, that is SE(3). The initial pose is given by the pose estimated at the previous frame combined with the inter-frame rotation estimate.
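For the rotation-only stage, a pure camera rotation R induces an image-to-image warp given by the infinite homography; this is standard multiple-view geometry, not stated explicitly on the slide:

    W(x; R) = \pi\big( K \, R \, K^{-1} \, \tilde{x} \big), \qquad \tilde{x} = (x, y, 1)^T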


Slide 18

Conclusion

• First live full dense reconstruction system...

• Limitation from the smoothness assumption on depth...


Slide 19

Important references

• [Pock08] T. Pock, Fast Total Variation for Computer Vision, PhD thesis, 2008.
• [Baker IJCV04] S. Baker and I. Matthews, Lucas-Kanade 20 Years On: A Unifying Framework, IJCV 2004.