Autonomous Robotic Manipulation
TRANSCRIPT
May 2010 Univ Huelva 7
Open Research Lines
Towards “e-Manufacturing”
“The UJI Industrial Robotics Telelaboratory”, IROS 2008
RFID and Visual Perception
Skyetek Mini-M1 RFID reader
Antenna (50 Ohms, 13.56 MHz)
Visual Interface
INPUT (Human Side)
PREDICTIVE INTERFACE (Virtual and Augmented Reality)
GRASPING EXECUTION (Robot Side)
Learning by Demonstration
Assistive Robots (Video-1, Video-2)
EURON
Special Interest Group
Manipulation and Grasping, Lightweight Manipulators
http://www.robot.uji.es/documents/manipulation/sig.html
Contact Persons
Claudio Melchiorri, Rezia Molfino, Pedro J. Sanz ([email protected], [email protected], [email protected])
IURS-2006: 6th International UJI Robotics School
http://www.robot.uji.es/research/events/iurs06/
September 18-22, 2006
Bonaire Hotel, Benicàssim, SPAIN
Summer School on Humanoid Robots
http://www.robot.uji.es/lab/plone/events/iurs07/
IURS-2007: 7th International UJI Robotics School
Summer School on “Assistive Robots”
September 24-28, 2007, Benicàssim (SPAIN)
General Chair / Program Chairs:
G. Recatalá (UJI, Spain), D. Kragic (RIT, Sweden), C. Balaguer (UC3M, Spain), P.J. Sanz (UJI, Spain)
EURON Summer Schools
Edited Materials
Ongoing European Projects
• EYESHOTS Heterogeneous 3D Perception across Visual Fragments
• GRASP Emergence of Cognitive Grasping through Introspection, Emulation and Surprise
• TRIDENT Marine Robots and Dexterous Manipulation for Enabling Autonomous Underwater Multipurpose Intervention Missions
Grant No.: 217077. Duration: 3 years. Starting date: 1.03.08.
Principal investigators & expertise:
Silvio P. Sabatini [1,5,7]
Giorgio Cannata [1,2,3]
Angel del Pobil [1,3,4]
Patrizia Fattori [6,9,10]
Fred Hamker [4,7,8]
Markus Lappe [6,7,8,9]
Marc Van Hulle [4,5,7]
Expertise key: [1] Robotics, [2] Biomechanical models, [3] Motor control, [4] Machine learning, [5] Computer vision, [6] Experimental neuroscience, [7] Theoretical neuroscience, [8] Cognitive psychology, [9] Psychophysics, [10] Neurophysiology
EC FP7 STREP Project, Unit E5 "Cognitive Systems, Interaction, Robotics"
UNIVERSITÀ DEGLI STUDI DI GENOVA
ALMA MATER STUDIORUM - UNIVERSITÀ DI BOLOGNA
Objective 1: Development of a robotic system for interactive visual stereopsis.
Objective 2: Development of a model of a multisensory egocentric representation of the 3D space.
Objective 3: Development of a model of human-robot cooperative actions in a shared workspace.
http://www.eyeshots.it/
Constructing a global awareness of the peripersonal space integrating visual, oculomotor and arm motor information:
•Build a computational model of reference frame transformation in the posterior parietal cortex
•Learn to fixate and/or reach toward nearby targets by applying the model to concurrent reaching and gazing actions
[Figure: reaching and ocular movements feed a visual/oculomotor representation; together with tactile information, cyclopean vision (disparity, version, vergence) and arm position, they build a body-centered representation supporting visuomotor awareness. Inset: workspace plot of arm joints J1, J2.]
FP7 Eyeshots, UJI contribution: http://www.eyeshots.it/
The aim is the design of a cognitive system capable of performing grasping and manipulation tasks in open-ended environments, dealing with novelty, uncertainty and unforeseen situations.
“Emergence of Cognitive Grasping through Emulation, Introspection, and Surprise”
http://www.csc.kth.se/grasp/
FP7 GRASP
FP7 STREP
Marine Robots and Dexterous Manipulation for Enabling Autonomous Underwater Multipurpose Intervention Missions
Strategic Objective: ICT 2009 4.2.1
Cognitive Systems, Interaction, Robotics
ID 248497
http://www.irs.uji.es/trident
Duration: 36 months. Funding: 3,248 Keuros.
The Consortium
PHASE I (Survey) PHASE II (Intervention)
The Envisioned Concept
A potential application:Underwater archaeology
Target Identification
Amphorae recovery with HCMR’s HOV “THETIS”, at a depth of 495 meters, using a suction pump (Images courtesy of Project KYTHNOS 2005, EUA, HCMR)
http://www.hcmr.gr/listview3_el.php?id=896
(USA, University of Hawaii, since 1997…)
A point of reference
• Faculty
• Technical Staff
• Management Assistant
The current team
Tactile sensors
JR3 12-DOF sensor to measure force, torque, and acceleration along 6 axes
Available Resources
Schunk-Robotnik Arm
[Figure: four-step book-extraction sequence with the fingertip: (1) move in Z until a high force in Z is sensed; (2) move in Z and X to grasp the book; (3) once α > 15, the book is grasped and turned about Y; (4) move in −Z and X to extract it.]
“the UJI Librarian Robot”
Available Resources
Robotnik hydraulic Robot Arm
Visual Servoing
http://www.robot.uji.es/documents/rauvi/
http://www.robot.uji.es/documents/rauvi/
Envisioned concept
Meeting CSIP (UK) January 2010
[García et al., 2010] “Increasing Autonomy within Underwater Intervention Scenarios: The User Interface Approach”. In Proc. of IEEE Systems Conference. San Diego, CA, USA, 2010.
http://www.robot.uji.es/documents/rauvi/
5DT Head Mounted Display (HMD)
5DT Data Glove Ultra Wireless Kit
http://www.robot.uji.es/documents/rauvi/
[Architecture diagram: internal sensors (MRU, FOG, DVL, imaging sonar, USBL, camera) feed a Sensor Abstraction Layer and a Perception Layer; an obstacle detector, navigator, coordinator and velocity controller drive primitives 1..n and thrusters 1..6 through a Robot Abstraction Layer; a centralized blackboard with condition and action layers is executed by a Petri net player; missions are written in MCL through a graphical user interface and compiled (MCL-Compiler) into Petri net mission files; hardware and virtual devices include the BarrettHand, a gripper, the PA10 arm (real and virtual), FireWire and virtual cameras, and force/tactile sensing.]
[Palomeras et al., 2010] “A Distributed Architecture for Enabling Autonomous Underwater Intervention Missions”. In Proc. of IEEE Systems Conference. San Diego, CA, USA, 2010.
Manipulation: Preliminary Aspects
Homunculus diagram of the motor cortex [Macmillan, 50]
Manipulation: a Measure of the Intelligence?
A Partial Taxonomy of Human Grasps
[Cutkosky & Wright, 86] “Modeling Manufacturing Grips and Correlations With the Design of Robotic Hands”
Increasing Power and Object Size
Increasing Dexterity, Decreasing Object Size
Gross task and geometry
Detailed task and geometry
Robotic Manipulation
Definition: the activities carried out by robotic devices in order to use and manipulate objects by means of physical interaction.
Manipulation Patterns:
• Prehensile
• Non-Prehensile
• Dexterous
Robotic Manipulation
Prehensile Manipulation Approaches:
• The contact-level approach
• The knowledge-based approach
Main Issues:
• Design of robotic hands
• Development of dexterous control techniques
• Application in service robotics
Towards Service Robotics
Modern Times
Towards Service Robotics
HERMES
an Intelligent Humanoid Robot, Designed and Tested for Dependability
“Development of a hand mechanism for grasping fresh foods in a supermarket”
Towards Service Robotics
Tomizawa, IROS’2006, University of Tsukuba: The Remote Shopping System
Dexterous Hands
Barrett Hand, DLR (Institute of Robotics), Shadow Hand,
NASA Robonaut, University of Bologna, University of Karlsruhe
Prof. Hirzinger
DLR (German Aerospace Center)
DLR
ICRA2007
Manipulation in EVAs (Extra-Vehicular Activities)
http://robonaut.jsc.nasa.gov/robonaut.html
EURON Research Roadmap
Robot loading household devices: 2010?
Universität Karlsruhe: Armar (2000), Armar-II (2002), Armar-III (2006)
Prof. Dillmann
ICRA’2008
Robot helping handicapped people: 2015?
ISAC (USA)
FRIEND(Germany)
1990’s, 2000’s
Robot helping handicapped people: 2015?
Honda P3 (Japan)
HRP-2 (Japan)
RI-MAN (Japan)
Siemens AG “DRESSMAN”
Fagor “DRIRON”
Ironing robot: 2015?
Berkeley, CA, ICRA’2010
Recent Predictions about Human-like Dexterous Manipulation:
• (EU) EUROP: “The Strategic Research Agenda for Robotics, 2009”
  http://www.robotics-platform.eu/sra
• (USA) “A Roadmap for US Robotics, 2009”
  http://www.us-robotics.us/
OVERVIEW
1. Visually-Guided Grasping (2D)
   1.1 visually-guided grasping (non dexterity)
   1.2 including dynamic scenarios (non dexterity)
   1.3 including learning capabilities and dexterity
2. Visually-Guided Grasping (3D)
3. Sensor-based Control Interaction
   3.1 planning of physical interaction tasks
   3.2 vision-force-tactile integration for robotic physical interaction
4. The UJI Service Robot: A Case Study
1. Visually-Guided Grasping (2D)
1.1 visually-guided grasping (non dexterity) [IMG-04-Sanz]
1.2 including dynamic scenarios (non dexterity) [IMG-04-Recatala]
1.3 including learning capabilities and dexterity [IMG-04-Morales]
1.1 visually-guided grasping (non dexterity)
The Human Hand’s Capabilities
“The Intelligent Pinch”
“Precision Grasp” vs “Power Grasp”
Two-Fingered vs Dexterous Robot Hands
DLR (Institute of Robotics), University of Karlsruhe,
NASA Robonaut, University of Bologna
Industrial gripper: “UMI RT 100”
A Generic Model
Visually-Guided Grasping
Determination → Planning → Execution
Perception → Reasoning → Action
Geometric Knowledge, e.g. symmetry, curvature, ... [Bajcsy (93), Arkin (98), ...]
Action-Oriented Perception
[Leyton (87), Blake (95),…]
2D Visually-Guided Grasping with Two-Fingered Hands
Global
Local
Grasp Stability
(unknown / unmodeled objects)
• Types of Grasps [Tan & Schlimler, 93]: Pinch vs. Wrap Grip
• Force-Closure [Nguyen, 88]
DEF: a grasp is force-closure iff we can exert, through the set of contacts, an arbitrary force and moment on the object.
Contact points P1 and P2 satisfying this condition are known as an “antipodal point grasp”.
[Figure: geometric interpretation, showing the normal and tangential contact forces f1n, f1t and f2n, f2t at P1 and P2.]
• Stability Conditions, [Montana, 91]
1. The curvature (from object or fingers).
2. The distance between the grasping points.
3. The viscoelasticity from fingers or object.
4. The existence or not of force feedback.
• Grasp Determination (GloSt)
Planar Grasping Characterization
[Figure: fingers 1 and 2 contact the object surface along the grasping line; at each contact, the friction cone around the normal force fn bounds the admissible tangential force ft.]
Coulomb friction model: ft ≤ μ·fn, with friction cone half-angle θ = arctan(μ).
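Nguyen’s two-finger force-closure condition can be sketched as a small geometric test (a minimal sketch in plain Python; the function name and the 2D tuple representation are illustrative assumptions, not part of the original work): a grasp is force-closure iff the grasping line through both contacts lies inside each friction cone of half-angle arctan(μ).

```python
import math

def is_antipodal(p1, n1, p2, n2, mu):
    """Nguyen-style force-closure test for two frictional point contacts.

    p1, p2: 2D contact points; n1, n2: unit inward surface normals;
    mu: Coulomb friction coefficient. The grasp is force-closure iff
    the grasping line p1-p2 lies inside both friction cones, i.e. the
    angle between the line and each normal is at most arctan(mu)."""
    half_angle = math.atan(mu)
    lx, ly = p2[0] - p1[0], p2[1] - p1[1]
    norm = math.hypot(lx, ly)
    lx, ly = lx / norm, ly / norm
    # Angle between the grasping line and each contact normal
    # (the line direction is flipped for the second contact).
    a1 = math.acos(max(-1.0, min(1.0, lx * n1[0] + ly * n1[1])))
    a2 = math.acos(max(-1.0, min(1.0, -lx * n2[0] - ly * n2[1])))
    return a1 <= half_angle and a2 <= half_angle
```

Two contacts on opposite parallel faces pass the test for any μ > 0; tilting the grasping line outside the cone makes it fail.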
• Grasp Determination (GloSt)
Unknown objects?
[Figure: camera looking along the optical axis at a 2D image of an object with a hole; candidate centers of gravity (cog, cog1, cog2) are marked. (IMG04)]
[Sanz et al., 2005] “Grasping the not-so-obvious: Vision-Based Object Handling for Industrial Applications”. IEEE Robotics and Automation Magazine
The Symmetry Knowledge and the Grasping Determination Problem
[Li et al., 2008] “Bilateral Symmetry Detection for Real-time Robotics Applications”. Int. J. of Robotics Research
Preliminary Conclusions (GloSt)
• CSF permits quantifying the symmetry degree of a shape in a simple and efficient manner
• CSF eases the geometric reasoning needed to seek grasping points in 2D images of real objects
• The global system has proven to work with a broad set of unknown objects, making real applications feasible
• Grasp Determination (LocSt)
1. Extraction of Grasping Regions
2. Selection of Compatible Regions
3. Grasp Refinement
Main Stages of the LocSt Algorithm
• Grasp Determination (LocSt)
• A “grasping region” is a segment of the contour whose points all have curvature below the curvature threshold
• A grasping region can therefore be described as a straight segment: all its points meet the curvature stability condition
• This description simplifies the subsequent computation and reasoning, reducing the complexity of the problem
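The region-extraction step can be sketched as a single scan over per-point curvature estimates (a minimal sketch; the threshold value, the minimum region length, and the curvature estimator are assumptions, and wrap-around across the contour’s start index is ignored for simplicity):

```python
def grasping_regions(curvatures, threshold, min_length=3):
    """Split a contour into grasping regions.

    curvatures: curvature estimate at each contour point.
    A grasping region is a maximal run of consecutive points whose
    curvature magnitude stays below the threshold, long enough
    (min_length points) to be approximated by a straight segment.
    Returns a list of (start, end) index pairs, end exclusive."""
    regions, start = [], None
    for i, k in enumerate(curvatures):
        if abs(k) < threshold:
            if start is None:
                start = i  # a low-curvature run begins
        else:
            if start is not None and i - start >= min_length:
                regions.append((start, i))
            start = None
    if start is not None and len(curvatures) - start >= min_length:
        regions.append((start, len(curvatures)))
    return regions
```

Each returned segment then becomes a candidate GR for the compatibility and refinement stages.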
• Grasp Determination (LocSt)
[Figure: Example-1, an object contour segmented into grasping regions GR1-GR4.]
• Grasp Determination (LocSt)
[Figure: Example-2, a more complex contour segmented into grasping regions GR1-GR17.]
• Grasp Determination (LocSt): Compatible Regions
• Experimental Results (GloSt vs LocSt)
[Figure: GloSt and LocSt grasps computed on an Allen wrench and on pincers.]
• Experimental Results (GloSt vs LocSt)
Figure by [Faverjon & Ponce (91)]
GloSt / LocSt
• Experimental Results (LocSt)
Types of grasps
Squeezing grasps Expansion grasps
• Experimental Results (LocSt)
Squeezing grasps Expansion grasps
• Experimental Results (LocSt)
• Experimental Results (LocSt)
Figure by [Faverjon & Ponce (91)]
• Experimental Results (LocSt)
Promoting Active Perception?
Conclusions
• Fast grasp computation has been achieved with state-of-the-art technology
• This method is able to find solutions, including internal contours or expansion grasps
• Indirect benefits have been obtained that can be applied in other research domains (e.g. the use of Φ in pattern recognition algorithms)
• Ongoing research: – Extension towards dynamic scenarios – Extension towards dexterous manipulation (e.g. the BarrettHand)
2D Visually-Guided Graspingwith Two-Fingered Hands
1.2 including dynamic scenarios (non dexterity)
• Towards Dynamic Scenarios [Recatalá et al., 2002-04]
“Grasp Tracking”
Gabriel RecataláEmail: [email protected]
• Towards Dynamic Scenarios
“Grasp Tracking”
video-03
• Towards Dynamic Scenarios“The Catching Problem”
Example from MIT
The MIT Whole Arm Manipulator (WAM)
The Fast Eye Gimbals (FEGs) mounted to a ceiling rafter
http://web.mit.edu/nsl/www/
WAM Catching of a Paper Airplane
http://web.mit.edu/nsl/www/
• Towards Dynamic Scenarios“The Catching Problem”
• Towards Dynamic Scenarios“The Catching Problem”
MinERVA PROJECT (TUM, Germany)
“Manipulating Experimental Robot with Visually-Guided Actions”
Looking for human-robot analogies in the catching problem
1.3 including learning capabilities and dexterity
• Towards Dexterous Manipulation [Morales et al., 2002-05]
Univ. of Massachusetts
(USA)
Prof. Grupen
Antonio MoralesEmail: [email protected]
• Towards Dexterous Manipulation
UMass Humanoid Torso
Two 7 d.o.f. arms.
Pan-tilt head.
Stereo camera system.
Force-torque sensor on fingertips.
Two three-fingered Barrett hands.
• Towards Dexterous Manipulation
Goal: Online vision-based grasping of unmodeled planar objects
1. Process the stereo images of the object
2. Generate a number of feasible candidate grips (see ICRA 2001, IROS 2002, IMG 2004)
3. Select the grasp to execute
4. Execute the grip
Generation of grasping triplets:
Stereo images → Object contour → Grasping regions
Triplets of regions → Triplets of grasping points
• Towards Dexterous Manipulation
video-04
More Results
Which one?
Which one to execute? Why prefer one over the others?
A learning framework: learn through successive experiences the relation between the reliability of a grasp and its vision-based description.
• Abstract grasp characterization scheme.
• Practical measurement of reliability.
• A methodology for predicting the reliability of a grasp based on its similarity to past attempts.
• An active learning technique to select the next grasp to execute with the purpose of increasing the predictive performance of the accumulated experience.
Vision-based grasp characterization
Based on nine high-level features
Feature (symbol):
• Contour curvature (CC), Q_CC
• Triangle size (TS), Q_TS
• Point arrangement (PA), Q_PA
• Finger limit (FGL), Q_FGL
• Real focus centering (RFC), Q_RFC
• Finger spread (FS), Q_FS
• Finger extension (FE), Q_FE
• Real focus deviation (RFD), Q_RFD
• Force line (FCL), Q_FCL

Properties: based on visual information; hand constraints included; invariant to location and orientation; physical meaning; reliability and robustness concerns; object independent.

These features define the G-space (G_S): g_i = {q_1, ..., q_9} ∈ G_S
Experimental reliability test
CLASS  DESCRIPTION
A      Finished the test
B      Dropped during 3rd sequence
C      Dropped during 2nd sequence
D      Dropped during 1st sequence
E      Couldn't lift the object

Ω = { A, B, C, D, E }; for any grasp g, ω_g ∈ Ω (IMG’04)
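Because the classes run from full success (A) to complete failure (E), an ordinal encoding gives a natural “error distance” between a predicted and an actual class (a sketch; the numeric encoding is an assumption consistent with the class ordering, not taken from the original):

```python
CLASSES = "ABCDE"  # ordered from "finished the test" to "couldn't lift"

def error_distance(predicted, actual):
    """Ordinal distance between two reliability classes (0..4)."""
    return abs(CLASSES.index(predicted) - CLASSES.index(actual))
```

A distance of 0 means the predicted class was exactly right; larger values penalize predictions further from the observed outcome.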
Grasp reliability prediction
Given a grasp g_q, compute the probability P of each reliability class ω from the results of its K nearest neighbors, weighted by distance:

p(ω | g_q) = Σ_{g_i ∈ KNN(g_q), ω_i = ω} K(d_i)  /  Σ_{g_j ∈ KNN(g_q)} K(d_j)

g_q ∈ Q_S; KNN(g_q): the K nearest neighbors of g_q in Q_S; K(d_i): kernel function, with d_i the Euclidean distance on Q_S from g_q to g_i.

• KNN classification rule
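The distance-weighted KNN rule can be sketched in plain Python (the kernel K(d) = 1/(1+d) and the default k are illustrative assumptions; any decreasing kernel fits the rule):

```python
import math

def predict_class_probs(g_q, experience, k=5):
    """Distance-weighted KNN estimate of p(class | grasp).

    g_q: feature vector of the query grasp.
    experience: list of (feature_vector, reliability_class) pairs.
    Each of the k nearest past grasps votes for its class with
    weight K(d) = 1 / (1 + d); votes are normalized to probabilities."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    neighbors = sorted(experience, key=lambda e: dist(g_q, e[0]))[:k]
    weights = {}
    for g_i, cls in neighbors:
        weights[cls] = weights.get(cls, 0.0) + 1.0 / (1.0 + dist(g_q, g_i))
    total = sum(weights.values())
    return {cls: w / total for cls, w in weights.items()}
```

The returned dictionary sums to one over the classes represented among the neighbors; nearer past grasps dominate the vote.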
Experimental database
• To obtain experimental data, we have carried out exhaustive series of grasp trials
• Four objects
• A wide variety of grasp configurations on each object
• Twelve trials for each grasp (4 times in 3 different orientations)
• More than three hundred executed trials, on four data sets
Prediction performance

Error distance | Random | KNN
0              | 23.5%  | 50.8%
1              | 26.2%  | 21.9%
2              | 20.3%  | 12.8%
3              | 20.7%  | 12.0%
4              | 9.3%   | 2.5%
e              | 0.415  | 0.223

• The prediction error decreases as the training dataset grows.
• KNN clearly outperforms random prediction.
Active learning
An “active learning” strategy selects the next action to execute with the aim of acquiring new knowledge of the problem; here, the next action is the next grasp to execute.

Exploration rule:
• Uses the KNN prediction function: g_i ⇒ ω_i with confidence p(ω_i | g_i)
• Given n candidates, chooses the candidate with the least confident prediction:
  g* = argmin_i p(ω_i | g_i)
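The least-confidence exploration rule reduces to a one-liner over the candidate set (a sketch; `predict` stands for any function returning class probabilities, such as a KNN predictor):

```python
def select_next_grasp(candidates, predict):
    """Least-confidence active-learning rule.

    candidates: candidate grasp descriptions.
    predict: function mapping a grasp to {class: probability}.
    Picks the candidate whose most likely class has the lowest
    probability, i.e. the grasp the accumulated experience is least
    sure about, so executing it is maximally informative."""
    return min(candidates, key=lambda g: max(predict(g).values()))
```

Executing the chosen grasp and adding its observed outcome to the experience base closes the active-learning loop.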
Active learning
A validation framework aimed at emulating execution in the real world, while measuring the improvement in prediction performance.
• A random selector is defined for comparison
• The exploration is executed several times and the results are averaged
• The exploration procedure reaches an optimum after about a hundred trials