

1

Low-Latency Event-Based Visual Odometry

Andrea Censi 1, Davide Scaramuzza 2

1 Laboratory for Information and Decision Systems, Massachusetts Institute of Technology

2 Department of Informatics, University of Zurich

Acknowledgments: Jonas Strubel, Christian Brandli, Tobi Delbruck, Matia Pizzoli, Christian Forster.

Slides available at censi.mit.edu/slides

2

Low-Latency Event-Based Visual Odometry

3

recovering camera motion

4

[frames with events colored by sign: brightness increased / brightness decreased]

using a “neuromorphic” sensor


5

‣ What is a “neuromorphic” sensor?

‣ Why would you want to use it in robotics?

Come to the “Non-traditional Cameras” workshop on Thursday for technical details!

6

7

Slide by Christian Brandli

8

‣ Traditional camera: frames at fixed intervals.

[figure: frames on a time axis; next frame]

9

‣ “Neuromorphic” sensor: events generated any time a single pixel sees a change in brightness.

[animation: a single pixel’s brightness signal over time, emitting an event at each change]
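The event-generation rule above can be sketched as a simulation of a single pixel. This is a minimal illustration, assuming the standard DVS behavior of thresholding changes in log-brightness; the threshold value and function name are made up for the example.

```python
import math

def generate_events(timestamps, brightness, threshold=0.15):
    """Simulate a single DVS pixel: emit an event each time the
    log-brightness has changed by `threshold` since the last event."""
    events = []
    ref = math.log(brightness[0])  # log-intensity at the last event
    for t, b in zip(timestamps[1:], brightness[1:]):
        logb = math.log(b)
        # Emit one event per threshold crossing (possibly several per sample).
        while abs(logb - ref) >= threshold:
            sign = 1 if logb > ref else -1
            ref += sign * threshold
            events.append((t, sign))
    return events

# A pixel that brightens then darkens back yields +1 events, then -1 events.
ts = [i * 1e-6 for i in range(6)]          # microsecond timestamps
bright = [1.0, 1.2, 1.5, 1.5, 1.2, 1.0]    # brightness samples
print(generate_events(ts, bright))          # signs: +1, +1, -1, -1
```

Note there is no frame clock anywhere: a constant brightness (samples 2 to 3 above) produces no output at all.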

10

‣ “Neuromorphic” sensor: events generated any time a single pixel sees a change in brightness.

[figure: the events stream along the time axis]

The k-th event carries: a timestamp (1 µs resolution), pixel coordinates, and a sign (+1 or -1).
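The event record just described (timestamp, pixel coordinates, sign) can be written down as a small data type. A sketch in Python; the field names are illustrative, not any DVS driver's actual API.

```python
from typing import NamedTuple

class Event(NamedTuple):
    """One DVS event, as described on the slide: a timestamp with 1 µs
    resolution, the pixel that fired, and the sign of the brightness change."""
    t: float   # timestamp in seconds (1 µs resolution)
    x: int     # pixel column (0..127 on the 128x128 DVS)
    y: int     # pixel row
    s: int     # +1 brightness increased, -1 brightness decreased

stream = [
    Event(t=0.000001, x=64, y=30, s=+1),
    Event(t=0.000004, x=65, y=30, s=-1),
]
# Events arrive asynchronously, already ordered by timestamp.
assert stream[0].t < stream[1].t
```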

11

‣ “Neuromorphic” sensors:
- Event-based
- Asynchronous and distributed per-pixel processing
- Analog computation

Liu, Delbruck. Neuromorphic Sensory Systems. 2010. DOI:10.1016/j.conb.2010.03.00

12

‣ “Dynamic Vision Sensor” (DVS): a neuromorphic sensor developed by Delbruck’s group (ETH).
- Commercially available ($2.5k).

Lichtsteiner, Posch, Delbruck. A 128x128 120 dB 15µs Latency Asynchronous Temporal Contrast Vision Sensor. 2008. DOI:10.1109/JSSC.2007.914337

13

‣ The earliest “silicon retina” was developed by Carver Mead at Caltech.

Mead. Neuromorphic electronic systems. Proceedings of the IEEE, 1990.

14

‣ The DVS is fast!

‣ DVS visualization: events histogram in a time slice.

[video: CMOS vs. DVS]

Mueggler et al., IROS 2014
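The visualization named on this slide — an events histogram in a time slice — amounts to accumulating the signed events per pixel over a short window. A minimal sketch, with made-up function and variable names:

```python
def event_histogram(events, t0, t1, width=128, height=128):
    """Accumulate signed events with t0 <= t < t1 into a 2D histogram,
    one bin per pixel; render it as an image to visualize the DVS output."""
    hist = [[0] * width for _ in range(height)]
    for t, x, y, s in events:
        if t0 <= t < t1:
            hist[y][x] += s
    return hist

events = [(0.001, 5, 2, +1), (0.002, 5, 2, +1), (0.003, 7, 2, -1),
          (0.040, 5, 2, +1)]              # the last event falls outside the slice
h = event_histogram(events, t0=0.0, t1=0.033)
print(h[2][5], h[2][7])                    # 2 -1
```

Shrinking the slice (33 ms, 1 ms, 0.05 ms) is exactly what the following slides do to show how fast the sensor is.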

15

‣ The DVS is really fast!

16

‣ The DVS is really fast!

[videos at increasing time resolution: 1 video frame = 33 ms, 1 ms, 0.05 ms]

17

‣ Goal: low-latency control architecture.

[block diagram: “fast” sensors feed a low-latency control loop that issues commands; “slow” sensors feed a “slow”, “cognitive” level that provides “context” to the loop]
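The two-rate architecture on this slide can be caricatured in a few lines: every event drives the fast loop immediately, while occasional frames refresh a slower "context". A deterministic single-threaded sketch (a real system would run the two loops concurrently); all names are illustrative.

```python
def run_loop(events, frames, control, cognition):
    """Sketch of the two-rate architecture: every DVS event drives the
    low-latency control loop at once, while CMOS frames update a slower
    'context' that the controller reads."""
    context = None
    commands = []
    frames = iter(frames)
    next_frame = next(frames, None)
    for ev in events:                            # events arrive far more often
        while next_frame is not None and next_frame[0] <= ev[0]:
            context = cognition(next_frame)      # "slow", "cognitive" level
            next_frame = next(frames, None)
        commands.append(control(ev, context))    # low-latency loop
    return commands

# Three events but only one (late) frame: the first commands run without context.
events = [(0.001, +1), (0.002, -1), (0.040, +1)]
frames = [(0.033, "frame A")]
cmds = run_loop(events, frames,
                control=lambda ev, ctx: (ev[0], ctx),
                cognition=lambda fr: fr[1])
print(cmds)   # [(0.001, None), (0.002, None), (0.040, 'frame A')]
```

The point of the diagram is latency: commands are issued at event rate, never blocked waiting for the next frame.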

18

[photo: DVS and CMOS camera]

19

[block diagram: the DVS feeds the low-latency control loop; the CMOS camera feeds the “slow”, “cognitive” level, which provides “context”]

20

Event-Based Visual Odometry

‣ Goal: estimate the camera motion from the events.

‣ Large solution space:
- Use additional sensors? This work: yes, use an additional CMOS sensor.
- Use “features”? This work: no, use the raw events.

21

[timeline: CMOS frames and DVS events over time]

(calibration details in the Thursday workshop)

22

[timeline: CMOS frames and DVS events over time]

- One event doesn’t tell much.
- Low resolution (128 × 128).

- You see incremental motion.
- You can decouple the motion along the axes.

23

‣ Idea: reduce the problem to “localization” with respect to the previous CMOS frame.

[timeline: CMOS frames and DVS events over time]
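One way to picture this "localization" idea: events fire at intensity edges, so once a candidate camera displacement is undone, the events should land on strong gradients of the previous CMOS frame. Candidate displacements can then be scored by summed gradient magnitude. This is an illustrative scoring sketch only, not the paper's actual likelihood model; all names are made up.

```python
def localize(events, grad_mag, candidates):
    """Score small candidate pixel displacements against the previous CMOS
    frame: undo each candidate, sum the frame's gradient magnitude at the
    corrected event locations, and keep the best-scoring candidate."""
    h, w = len(grad_mag), len(grad_mag[0])
    best, best_score = None, float("-inf")
    for dx, dy in candidates:
        score = 0.0
        for _, x, y, _ in events:          # events as (t, x, y, sign)
            u, v = x - dx, y - dy          # undo the candidate displacement
            if 0 <= u < w and 0 <= v < h:
                score += grad_mag[v][u]
        if score > best_score:
            best, best_score = (dx, dy), score
    return best

# A vertical edge at column 3; events observed at column 4, i.e. the camera
# has shifted one pixel since the last frame.
grad = [[0, 0, 0, 1, 0, 0],
        [0, 0, 0, 1, 0, 0]]
events = [(0.0, 4, 0, +1), (0.0, 4, 1, +1)]
print(localize(events, grad, candidates=[(-1, 0), (0, 0), (1, 0)]))  # (1, 0)
```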

24

The 1D case — pure rotation

[animation: events in the (time, pixel) plane and the estimated velocity]
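In the 1D pure-rotation case, events trace straight lines in the (time, pixel) plane, so the apparent velocity can be read off as a slope. A least-squares sketch (illustrative only; it assumes a single dominant motion and takes events as bare (time, pixel) pairs):

```python
def estimate_velocity(events):
    """1D pure-rotation case: fit the least-squares slope of pixel vs. time
    over a batch of events, giving the apparent velocity in pixels/second."""
    n = len(events)
    mt = sum(t for t, _ in events) / n
    mx = sum(x for _, x in events) / n
    num = sum((t - mt) * (x - mx) for t, x in events)
    den = sum((t - mt) ** 2 for t, _ in events)
    return num / den

# A pattern sweeping 100 pixels/s produces events along x = 100 * t.
events = [(t / 1000, 100 * t / 1000) for t in range(50)]
print(estimate_velocity(events))   # ~100 pixels per second
```

Because every event carries a microsecond timestamp, the estimate can be refreshed after each event rather than once per frame.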

25

[plot: estimated velocity over time]

26

‣ Very good accuracy for rotation: drift comparable to visual odometry.

‣ Poor tracking of translation: very low SNR due to small apparent motion (also: needs to know the image’s depth map).

[plot: odometry estimate vs. ground truth over a 0°–90° rotation; error annotations: 1.0°–8.9°, 2.1°, 2.1°]

27

Summary

‣ DVS: interesting new sensor for agile robots!

‣ Goal: low-latency control architectures.

‣ So far: worked on inference problems:
- Blinking-LEDs tracking (IROS 2013)
- Visual odometry with CMOS fusion (this work)
- Feature-based visual odometry (Mueggler et al., IROS 2014)

Come to the “Non-traditional Cameras” workshop on Thursday for technical details!
