
System Integration and Experimental Results

Intelligent Robotics Research Centre (IRRC)

Department of Electrical and Computer Systems Engineering

Monash University, Australia

Visual Perception and Robotic Manipulation

Springer Tracts in Advanced Robotics

Chapter 7

Geoffrey Taylor

Lindsay Kleeman


Overview

• Stereoscopic light stripe scanning

• Object modelling and classification

• Multi-cue tracking (edges, texture, colour)

• Visual servoing

• Real-world experimental manipulation tasks with an upper-torso humanoid robot


Motivation

• To enable a humanoid robot to perform manipulation tasks in a domestic environment:

– A domestic helper for the elderly and disabled

• Key challenges:

– Ad hoc tasks with unknown objects

– Robustness to measurement noise/interference

– Robustness to calibration errors

– Interaction to resolve ambiguities

– Real-time operation


Architecture


Light Stripe Scanning

• Triangulation-based depth measurement.

(Figure: single-camera triangulation geometry, with the stripe generator and camera separated by baseline B, measuring depth D to the scanned object)
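To make the geometry concrete, the following is a minimal sketch (not the book's implementation) of how depth follows from triangulation: the camera ray through the observed stripe pixel is intersected with the known stripe plane. The focal length, baseline B and stripe angle are assumed to come from calibration, and the sign conventions are illustrative.

import numpy as np

def stripe_depth(pixel_x, focal_px, baseline, stripe_angle):
    """Toy single-camera light stripe triangulation in the scan plane.

    pixel_x      : stripe image coordinate (pixels, principal point at 0)
    focal_px     : focal length in pixels
    baseline     : distance B from the camera to the stripe generator (metres)
    stripe_angle : stripe projection angle measured from the baseline (radians)
    """
    ray_slope = pixel_x / focal_px            # camera ray: x = ray_slope * z
    # Stripe plane through the generator at x = baseline: x = baseline - z / tan(angle)
    depth = baseline / (ray_slope + 1.0 / np.tan(stripe_angle))
    return depth                              # depth D along the optical axis

# Example: roughly a 6 mm lens (~800 px), 100 mm baseline, stripe at 60 degrees
print(stripe_depth(pixel_x=50.0, focal_px=800.0, baseline=0.1, stripe_angle=np.radians(60)))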


Stereo Stripe Scanner

• Three independent measurements provide redundancy for validation.

(Figure: stereo stripe scanner geometry, with the laser diode midway between the left and right cameras separated by baseline 2b; the stripe point X projects to xL and xR on the left and right image planes, and the stripe plane angle is θ)
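The redundancy can be exploited with a simple consistency test, sketched below under assumed rectified cameras at ±b: the point is triangulated from the two image measurements and accepted only if it also lies, within a tolerance, on the known laser plane. This illustrates the validation idea, not the authors' actual error model.

import numpy as np

def validate_stripe_point(xL, xR, focal_px, half_baseline, stripe_plane, tol=0.005):
    """Illustrative consistency check for a stereoscopic stripe scanner.

    xL, xR        : stripe image x-coordinates in the left/right cameras (pixels)
    focal_px      : focal length in pixels (identical rectified cameras assumed)
    half_baseline : b, half the distance between the two cameras (metres)
    stripe_plane  : (n, d) with unit normal n and offset d of the known laser plane
    tol           : maximum allowed plane residual in metres (assumed threshold)

    Returns (point, ok): the triangulated 3D point and whether all three
    measurements (left image, right image, laser plane) agree.
    """
    disparity = xL - xR
    if disparity <= 0:
        return None, False
    # Standard rectified-stereo triangulation, origin midway between the cameras.
    z = 2.0 * half_baseline * focal_px / disparity
    x = (xL + xR) / 2.0 * z / focal_px
    point = np.array([x, 0.0, z])            # scan plane assumed at y = 0 for brevity
    n, d = stripe_plane
    residual = abs(np.dot(n, point) - d)     # distance of the point from the laser plane
    return point, residual < tol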


Reflections/Cross Talk


Single Camera Result

(Figure: scan results from a single camera scanner compared with the robust stereoscopic scanner)


3D Object Modelling

• Want to find objects with minimal prior knowledge:

– Use geometric primitives to represent objects

• Segment 3D scan based on local surface shape.

(Figure: surface type classification of the raw scan)
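One common way to realise this kind of local shape labelling (assumed here for illustration; the book's details may differ) is to look at the signs of the estimated mean and Gaussian curvature at each range point:

def classify_surface(mean_curv, gauss_curv, eps=1e-3):
    """Toy surface-type labelling from estimated curvatures.

    mean_curv, gauss_curv : mean curvature H and Gaussian curvature K at a point
    eps                   : tolerance below which a curvature is treated as zero (assumed)
    """
    H = 0.0 if abs(mean_curv) < eps else mean_curv
    K = 0.0 if abs(gauss_curv) < eps else gauss_curv
    if H == 0.0 and K == 0.0:
        return 'plane'
    if K == 0.0:
        return 'ridge/valley'     # one principal curvature near zero: cylinder/cone-like
    if K > 0.0:
        return 'peak/pit'         # curvatures of the same sign: sphere-like
    return 'saddle'               # curvatures of opposite sign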


Segmentation

• Fit plane, sphere, cylinder and cone to segments.

• Merge segments to improve fit of primitives.

(Figure: segmentation pipeline showing the raw scan, surface type classification, final segmentation and fitted geometric models)
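The fit-and-merge idea can be sketched for the simplest primitive, a plane: neighbouring segments are combined whenever a single least-squares fit still explains both well. The threshold and the restriction to planes are assumptions for brevity; spheres, cylinders and cones follow the same pattern with their own fitting routines.

import numpy as np

def fit_plane(points):
    """Least-squares plane fit; returns (unit normal, offset, RMS residual)."""
    centroid = points.mean(axis=0)
    # The smallest singular vector of the centred points is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    offset = np.dot(normal, centroid)
    rms = np.sqrt(np.mean((points @ normal - offset) ** 2))
    return normal, offset, rms

def try_merge(points_a, points_b, max_rms=0.003):
    """Merge two neighbouring segments if a single plane still fits them well.

    max_rms is an assumed threshold in metres.
    """
    merged = np.vstack([points_a, points_b])
    _, _, rms = fit_plane(merged)
    return (merged, True) if rms < max_rms else (None, False)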


Object Classification

• Scene described by adjacency graph of primitives.

• Objects described by known sub-graphs.
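As an illustration of the graph idea (using networkx as a stand-in; the node names and shape labels are invented for the example), a known object sub-graph can be searched for inside the scene adjacency graph:

import networkx as nx
from networkx.algorithms import isomorphism

# Scene graph: nodes are fitted primitives, edges join adjacent surfaces.
scene = nx.Graph()
scene.add_node('s1', shape='plane')      # e.g. table top
scene.add_node('s2', shape='cylinder')   # e.g. cup body
scene.add_node('s3', shape='plane')      # e.g. cup bottom
scene.add_edges_from([('s1', 's2'), ('s2', 's3')])

# Known object model: a cup is a cylinder adjacent to a small plane.
cup = nx.Graph()
cup.add_node('body', shape='cylinder')
cup.add_node('base', shape='plane')
cup.add_edge('body', 'base')

# Search for the model as a sub-graph of the scene, matching on shape labels.
matcher = isomorphism.GraphMatcher(
    scene, cup, node_match=lambda a, b: a['shape'] == b['shape'])
print(list(matcher.subgraph_isomorphisms_iter()))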


Modelling Results

• Box, ball and cup:

(Figure: raw colour/range scan and the resulting textured polygonal models)


Multi-Cue Tracking

• Individual cues are only robust under limited conditions:

– Edges fail in low contrast, distracted by texture

– Textures not always available, distracted by reflections

– Colour gives only partial pose

• Fusion of multiple cues provides robust tracking in unpredictable conditions.


Tracking Framework

• 3D model-based tracking: object models are built from light stripe range data.

• Colour (selector), edges and texture (trackers) are measured simultaneously in every frame.

• Measurements fused in an extended Kalman filter:

– Cues interact with state through measurement models

– Individual cues need not recover the complete pose

– Extensible to any cues/cameras for which a measurement model exists.
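A minimal sketch of the fusion step, assuming linearised measurement models and independent cue noise, processes whatever cues are available one after another; each cue only needs a measurement function and Jacobian, not the full pose.

import numpy as np

def ekf_update(x, P, cues):
    """Illustrative sequential multi-cue EKF measurement update.

    x, P : pose state estimate and covariance
    cues : list of (z, h, H, R) per available cue, where z is the measurement,
           h(x) predicts it from the state, H(x) is the measurement Jacobian
           and R the measurement noise covariance. A cue may constrain only
           part of the pose (e.g. colour gives position but not orientation).
    """
    for z, h, H, R in cues:
        Hx = H(x)                                # Jacobian at the current estimate
        y = z - h(x)                             # innovation
        S = Hx @ P @ Hx.T + R                    # innovation covariance
        K = P @ Hx.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y                            # corrected state
        P = (np.eye(len(x)) - K @ Hx) @ P        # corrected covariance
    return x, P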


Colour Cues

• Filter created from colour histogram in ROI:

– Foreground colours promoted in histogram

– Background colours suppressed in histogram

(Figure: captured image used to generate the filter, and the output of the resulting filter)
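A rough sketch of the promote/suppress idea, using a simple hue histogram ratio; the 0 to 255 hue range, bin count and smoothing constant are assumptions.

import numpy as np

def build_colour_filter(fg_pixels, bg_pixels, bins=32):
    """Illustrative histogram ratio filter over hue values.

    fg_pixels : hue values sampled inside the object ROI
    bg_pixels : hue values sampled from the surrounding background
    Returns a lookup table that is large for foreground colours and small
    for colours that also occur in the background.
    """
    fg_hist, _ = np.histogram(fg_pixels, bins=bins, range=(0, 256), density=True)
    bg_hist, _ = np.histogram(bg_pixels, bins=bins, range=(0, 256), density=True)
    return fg_hist / (fg_hist + bg_hist + 1e-6)   # promote foreground, suppress background

def apply_colour_filter(hue_image, lut, bins=32):
    """Back-project the filter: each pixel gets the weight of its hue bin."""
    idx = np.clip((hue_image.astype(int) * bins) // 256, 0, bins - 1)
    return lut[idx]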


Edge Cues

(Figure: edge cue pipeline, where Sobel masks give directional edges that are combined with colour to extract silhouette edges; the fitted edges are then compared against the predicted projected edges of the model)
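The directional edge measurement can be illustrated with a standard Sobel operator; thresholded magnitudes would then be masked by the colour filter to keep silhouette edges and compared against the edges projected from the current model pose. This is a generic sketch, not the book's detector.

import numpy as np
from scipy import ndimage

def directional_edges(gray):
    """Toy Sobel edge extraction: returns gradient magnitude and direction.

    gray : 2D greyscale image as a float array.
    """
    gx = ndimage.sobel(gray, axis=1)   # horizontal gradient
    gy = ndimage.sobel(gray, axis=0)   # vertical gradient
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)     # edge orientation in radians
    return magnitude, direction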


Texture Cues

(Figure: texture cue pipeline showing the rendered prediction, feature detector, matched templates, outlier rejection and final matched features)
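The template matching and outlier rejection steps can be sketched with normalised cross-correlation inside a small search window around each predicted feature; the window size and acceptance threshold are assumed values, and the book's actual detector and rejection test may differ.

import numpy as np

def ncc(patch, template):
    """Normalised cross-correlation between an image patch and a template."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum()) + 1e-9
    return float((a * b).sum() / denom)

def match_template(image, template, predicted_xy, search=8, accept=0.8):
    """Search a small window around the predicted feature location.

    predicted_xy : (row, col) of the feature predicted from the rendered model
    search       : half-width of the search window in pixels (assumed)
    accept       : NCC threshold below which the match is rejected as an outlier
    """
    th, tw = template.shape
    best_score, best_xy = -1.0, None
    r0, c0 = predicted_xy
    for r in range(r0 - search, r0 + search + 1):
        for c in range(c0 - search, c0 + search + 1):
            patch = image[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue
            score = ncc(patch, template)
            if score > best_score:
                best_score, best_xy = score, (r, c)
    return (best_xy, best_score) if best_score >= accept else (None, best_score)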


Tracking Result


Visual Servoing

• Position-based 3D visual servoing (IROS 2004).

• Fusion of visual and kinematic measurements.
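In position-based servoing the control law operates on the estimated Cartesian error rather than on image features. A minimal proportional sketch follows; the gain and velocity limit are assumptions, not the book's controller.

import numpy as np

def servo_command(hand_position, goal_position, gain=0.5, v_max=0.1):
    """Toy position-based servo step: velocity command toward the goal.

    hand_position, goal_position : 3D positions in the robot base frame,
                                   taken from the fused visual/kinematic estimate
    gain  : proportional gain on the Cartesian error (assumed)
    v_max : speed limit in m/s (assumed)
    """
    error = goal_position - hand_position
    command = gain * error
    speed = np.linalg.norm(command)
    if speed > v_max:
        command *= v_max / speed      # saturate the commanded speed
    return command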


Visual Servoing

• 6D pose of hand estimated using extended Kalman filter with visual and kinematic measurements.

• State vector also includes hand-eye transformation and camera model parameters for calibration.
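The augmented state can be pictured as one long vector, sketched below with assumed parameter names and sizes; because the hand-eye transform and camera parameters are estimated alongside the hand pose, slow calibration errors are absorbed by the filter rather than corrupting the pose.

import numpy as np

# Illustrative layout only; the actual parameterisation in the book may differ.
state = np.concatenate([
    np.zeros(3),   # hand position in the base frame
    np.zeros(3),   # hand orientation (e.g. roll, pitch, yaw)
    np.zeros(6),   # hand-eye transformation (translation + rotation)
    np.zeros(4),   # camera model parameters (e.g. focal lengths, principal point)
])
# Kinematic measurements constrain the hand pose directly, while visual
# measurements depend on both the hand pose and the calibration entries,
# which lets the filter separate true motion from calibration error over time.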


Grasping Task

• Grasp a yellow box without prior knowledge of objects in the scene.


Grasping Task


Pouring Task

• Pour the contents of a cup into a bowl.


Pouring Task


Smell Experiment

• Fusion of vision, smell and airflow sensing to locate and grasp a cup containing ethanol.


Summary

• Integration of stereoscopic light stripe sensing, geometric object modelling, multi-cue tracking and visual servoing allows robot to perform ad hoc tasks with unknown objects.

• Suggested directions for future research:

– Integrate tactile and force sensing

– Cooperative visual servoing of both arms

– Interact with objects to learn and refine models

– Verbal and gestural human-machine interaction