TRANSCRIPT
CS283: Robotics Fall 2017: Localization
Sören Schwertfeger / 师泽仁
ShanghaiTech University
LOCALIZATION
[Diagram: the see-think-act cycle: Perception and Localization ("Position", Environment Model / Local Map, Global Map) feed Cognition / Path planning, which drives Motion Control; the robot acts in the Real World / Environment.]
Map based localization?
• Odometry, Dead Reckoning
• Localization based on external sensors, beacons or landmarks
• Probabilistic Map Based Localization
[Diagram: probabilistic map-based localization loop: the Encoder feeds the Prediction of Position (e.g. odometry), giving the predicted position; Perception yields the Observation (raw sensor data or extracted features); Matching against the Map database produces the matched observations, which drive the Position Update (Estimation) to the final position.]
Robotics ShanghaiTech University - SIST - 08.11.2017 3
Exteroceptive Sensor Noise
• Sensor noise is mainly influenced by the environment, e.g. surface, illumination, …
• and by the measurement principle itself, e.g. interference between two Kinects
• Sensor noise drastically reduces the useful information of sensor readings. The solution is:
  • to model sensor noise appropriately
  • to take multiple readings into account
  • to employ temporal and/or multi-sensor fusion
Effector Noise: Odometry, Deduced Reckoning
• Odometry and dead reckoning: the position update is based on proprioceptive sensors
  • Odometry: wheel sensors only
  • Dead reckoning: also heading sensors
• The movement of the robot, sensed with wheel encoders and/or heading sensors, is integrated to obtain the position.
  • Pros: straightforward, easy
  • Cons: errors are integrated -> unbounded
• Using additional heading sensors (e.g. a gyroscope) can help to reduce the accumulated errors, but the main problems remain the same.
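This integration step can be sketched for a differential-drive robot. The function below is an illustrative example, not from the slides; the wheel-base parameter and the midpoint heading approximation are assumptions:

```python
import math

def integrate_odometry(pose, d_left, d_right, wheel_base):
    """Integrate one encoder step into the pose (x, y, theta).

    d_left / d_right: distance travelled by each wheel since the last step.
    Uses the midpoint approximation for the heading during the step.
    """
    x, y, theta = pose
    ds = (d_left + d_right) / 2.0             # forward distance of the robot center
    dtheta = (d_right - d_left) / wheel_base  # heading change
    x += ds * math.cos(theta + dtheta / 2.0)
    y += ds * math.sin(theta + dtheta / 2.0)
    theta += dtheta
    return (x, y, theta)
```

Any noise in d_left / d_right is added into (x, y, θ) at every step and never removed, which is exactly the unbounded error growth noted above.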
Odometry: Growth of Pose Uncertainty for Straight-Line Movement
• Note: Errors perpendicular to the direction of movement are growing much faster!
Odometry: Growth of Pose Uncertainty for Movement on a Circle
• Note: The error ellipse does not remain perpendicular to the direction of movement!
Odometry: Example of a Non-Gaussian Error Model
• Note: Errors are not shaped like ellipses!
[Fox, Thrun, Burgard, Dellaert, 2000]
Courtesy AI Lab, Stanford
Odometry: Calibration of Errors
• The unidirectional square path experiment
Belief Representation
• How do we represent the robot's position, i.e. where the robot "believes" itself to be?
• a) Continuous map with a single-hypothesis probability distribution
• b) Continuous map with a multiple-hypothesis probability distribution
• c) Discretized map with a probability distribution
• d) Discretized topological map with a probability distribution
Single-hypothesis Belief – Continuous Line-Map
Single-hypothesis Belief – 2D Grid and Topological Map
Grid-Based Representation - Multi-Hypothesis
Courtesy of W. Burgard
Representation of the Environment
• Environment Representation
  • Continuous metric -> x, y, θ
  • Discrete metric -> metric grid
  • Discrete topological -> topological grid
• Environment Modeling
  • Raw sensor data, e.g. laser range data, grayscale images
    • large volume of data, low distinctiveness on the level of individual values
    • makes use of all acquired information
  • Low-level features, e.g. lines and other geometric features
    • medium volume of data, average distinctiveness
    • filters out the useful information, still ambiguities
  • High-level features, e.g. doors, a car, the Eiffel tower
    • low volume of data, high distinctiveness
    • filters out the useful information, few/no ambiguities, but possibly not enough information
Map Representation: Continuous Line-Based
a) Architecture map
b) Representation with a set of finite or infinite lines
Map Representation: Exact Cell Decomposition
• Exact cell decomposition - polygons
Map Representation: Approximate Cell Decomposition (1)
• Fixed cell decomposition
• Narrow passages disappear
Map Representation: Adaptive Cell Decomposition (2)
• For example: Quadtree
Map Representation: Occupancy Grid
• Fixed cell decomposition: occupancy grid example
• In occupancy grids, each cell may have a counter where 0 indicates that the cell has not been hit by any ranging measurements and therefore it is likely free-space. As the number of ranging strikes increases, the cell value is incremented and, above a certain threshold, the cell is deemed to be an obstacle
• The values of the cells are discounted when a ranging strike travels through the cell. This allows us to represent “transient” (dynamic) obstacles
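A minimal sketch of the counter scheme described above; the increment, decrement, and threshold values are illustrative assumptions:

```python
def update_cell(counter, hit, inc=1, dec=1, floor=0, ceil=100):
    """Update one occupancy counter: increment when a ranging
    measurement hits the cell, decrement when a beam passes
    through it (a 'miss'), clamped to [floor, ceil]."""
    if hit:
        return min(counter + inc, ceil)
    return max(counter - dec, floor)

def is_obstacle(counter, threshold=10):
    """A cell is deemed an obstacle once its counter exceeds the threshold."""
    return counter >= threshold
```

Because misses discount the counter, a cell occupied by a transient obstacle (e.g. a person walking by) eventually drops back below the threshold.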
SLAM: SIMULTANEOUS LOCALIZATION AND MAPPING
Map Building: The Problems
1. Map maintaining: keeping track of changes in the environment
   • e.g. a disappearing cupboard
   • e.g. a measure of belief for each environment feature
2. Representation and reduction of uncertainty
   • position of robot -> position of wall
   • position of wall -> position of robot
   • probability densities for feature positions
   • inconsistent map due to motion drift
Cyclic Environments
• Small local errors accumulate to arbitrarily large global errors!
• This is usually irrelevant for navigation
• However, when closing loops, the global error does matter
Raw Odometry …
Courtesy of S. Thrun
Scan Matching: compare to sensor data from the previous scan
Courtesy of S. Thrun
SLAM overview
• Let us assume that the robot's uncertainty at its initial location is zero.
• From this position, the robot observes a feature, which is mapped with an uncertainty related to the exteroceptive sensor error model
SLAM overview
• As the robot moves, its pose uncertainty increases under the effect of the errors introduced by the odometry
SLAM overview
• At this point, the robot observes two features and maps them with an uncertainty which results from the combination of the measurement error with the robot pose uncertainty
• From this, we can notice that the map becomes correlated with the robot position estimate. Similarly, if the robot updates its position based on an observation of an imprecisely known feature in the map, the resulting position estimate becomes correlated with the feature location estimate.
SLAM overview
• The robot moves again, and its uncertainty increases under the effect of the errors introduced by the odometry
SLAM overview
• In order to reduce its uncertainty, the robot must observe features whose location is relatively well known. These features can, for instance, be landmarks that the robot has already observed before.
• In this case, the observation is called loop-closure detection.
• When a loop closure is detected, the robot pose uncertainty shrinks.
• At the same time, the map is updated, and the uncertainty of the other observed features and of all previous robot poses is also reduced.
The Three SLAM Paradigms
• Most SLAM algorithms are based on one of the following three approaches:
  • Extended Kalman Filter SLAM (called EKF SLAM)
  • Particle Filter SLAM (called FastSLAM)
  • Graph-Based SLAM
EKF SLAM: overview
• Extended state vector y_t: robot pose x_t plus the positions of all features m_i in the map:

  y_t = [ x_t, m_0, …, m_{n−1} ]^T

• Example: 2D line landmarks with vector components (α_i, r_i); the size of y_t is 3 + 2n: three variables to represent the robot pose plus 2n variables for the n line landmarks:

  y_t = [ x_t, y_t, θ_t, α_0, r_0, …, α_{n−1}, r_{n−1} ]^T

• As the robot moves and takes measurements, the state vector and covariance matrix are updated using the standard equations of the extended Kalman filter
• Drawback: EKF SLAM is computationally very expensive.
Particle Filter SLAM: FastSLAM
• FastSLAM approach
  • Uses particle filters.
  • Particle filters: mathematical models that represent a probability distribution as a set of discrete particles that occupy the state space.
• Particle filter update
  a) Generate a new particle distribution using the motion model and the controls. For each particle:
    1. Compare the particle's prediction of the measurements with the actual measurements
    2. Particles whose predictions match the measurements are given a high weight
  b) Filter resampling:
    • Assign each particle a weight depending on how well its estimate of the state agrees with the measurements, and randomly draw particles from the previous distribution based on these weights, creating a new distribution.
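The weighting and resampling steps above can be sketched as follows. This is a generic 1-D illustration, not FastSLAM itself; the Gaussian measurement likelihood, its sigma, and the low-variance resampler are common choices, and all names are ours:

```python
import math
import random

def weight(predicted_z, actual_z, sigma=1.0):
    """Gaussian likelihood of the actual measurement under the
    particle's predicted measurement (sigma is an assumed noise level)."""
    return math.exp(-0.5 * ((predicted_z - actual_z) / sigma) ** 2)

def resample(particles, weights):
    """Low-variance (systematic) resampling: draw len(particles) new
    particles with probability proportional to their weights."""
    n = len(particles)
    step = sum(weights) / n
    r = random.uniform(0.0, step)   # single random offset
    new_particles = []
    c, i = weights[0], 0
    for m in range(n):
        u = r + m * step
        while u > c:                # advance to the particle covering u
            i += 1
            c += weights[i]
        new_particles.append(particles[i])
    return new_particles
```

Particles whose predictions match the measurements get weights near 1 and are duplicated by the resampler; poorly matching particles get weights near 0 and die out.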
Particle Filter SLAM
• FastSLAM approach
• Particle set:
FAST SLAM example
Courtesy of S. Thrun
FAST SLAM example
Courtesy of S. Thrun
Graph-Based SLAM (1/3)
• The SLAM problem can be interpreted as a sparse graph of nodes and constraints between nodes.
• The nodes of the graph are the robot locations and the features in the map.
• Constraints: the relative position between consecutive robot poses (given by the odometry input u), and the relative position between the robot locations and the features observed from those locations.
Graph-Based SLAM (2/3)
• Constraints are not rigid but soft constraints!
• Relaxation: computing the solution to the full SLAM problem means computing the best estimate of the robot path and the environment map.
• Graph-based SLAM represents the robot locations and features as the nodes of an elastic net. The SLAM solution can then be found by computing the state of minimal energy of this net.
Graph-Based SLAM (3/3)
• Significant advantage of graph-based SLAM techniques over EKF SLAM:
  • EKF SLAM: the computation and memory required to update and store the covariance matrix grow quadratically with the number of features.
  • Graph-based SLAM: the update time of the graph is constant, and the required memory is linear in the number of features.
• However, the final graph optimization can become computationally costly if the robot path is long.
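As a toy illustration of the elastic-net view, consider a 1-D pose graph: each constraint acts like a spring with error e_ij = (p_j - p_i) - z_ij, and the solution minimizes the total squared error. The plain gradient-descent solver, its step size, and the anchoring of p_0 below are illustrative assumptions; real systems use sparse least-squares solvers:

```python
def solve_pose_graph_1d(constraints, n, iters=200, lr=0.1):
    """Minimize sum of squared constraint errors e_ij = (p_j - p_i) - z_ij
    over 1-D poses p_0 .. p_{n-1}, with p_0 fixed at 0 as the anchor.

    constraints: list of (i, j, z_ij) relative measurements
    (odometry edges and loop-closure edges look identical here).
    """
    p = [0.0] * n
    for _ in range(iters):
        grad = [0.0] * n
        for i, j, z in constraints:
            e = (p[j] - p[i]) - z
            grad[j] += 2 * e       # d/dp_j of e^2
            grad[i] -= 2 * e       # d/dp_i of e^2
        for k in range(1, n):      # p[0] stays anchored
            p[k] -= lr * grad[k]
    return p
```

With odometry edges claiming each step moves 1.0 but a loop-closure edge measuring 2.2 for the whole path, the "springs" spread the 0.2 inconsistency over all poses instead of dumping it on the last one.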
Map based localization?
• Odometry, Dead Reckoning
• Localization based on external sensors, beacons or landmarks
• Probabilistic Map Based Localization
ICP: Iterative Closest Points Algorithm
• Align two partially overlapping point sets (2D or 3D)
• Given an initial guess for the relative transform
Material derived from Ronen Gvili : www.cs.tau.ac.il/~dcor/Graphics/adv-slides/ICP.ppt
Data Types
• Point sets
• Line segment sets (polylines)
• Implicit curves: f(x, y, z) = 0
• Parametric curves: (x(u), y(u), z(u))
• Triangle sets (meshes)
• Implicit surfaces: s(x, y, z) = 0
• Parametric surfaces: (x(u,v), y(u,v), z(u,v))
Motivation
• Scan matching - registration
• Shape inspection
• Motion estimation
• Appearance analysis
• Texture mapping
• Tracking
Aligning 3D Data
• Continuous lines or a set of points…
Corresponding Point Set Alignment
• Let M be a model point set (or map, or previous scan).
• Let S be a scene point set (current scan).
We assume:
1. N_M = N_S.
2. Each point s_i corresponds to m_i.
Robotics ShanghaiTech University - SIST - 08.11.2017 46
Corresponding Point Set Alignment
The mean squared error (MSE) objective function:

  f(Rot, Trans) = (1/N_S) Σ_{i=1}^{N_S} ‖ m_i − Rot(s_i) − Trans ‖²

or, written with a quaternion q = (q_R, q_T):

  f(q) = (1/N_S) Σ_{i=1}^{N_S} ‖ m_i − R(q_R) s_i − q_T ‖²

The alignment is:

  (rot, trans, d_mse) = Φ(M, S)
Aligning 3D Data
• If correct correspondences are known, we can find the correct relative rotation/translation, e.g. using Horn's method
Horn's Method
• Input: two point sets X and Y
• Output: rotation matrix R and translation vector t that align Y to X
Material by Toru Tamaki, Miho Abe, Bisser Raytchev, Kazufumi Kaneda
Horn's method: correspondence is known.
• Unknown correspondence: only the two point sets X and Y are given.
• Known correspondence: each point x_i is paired with y_i:
  x_1 ↔ y_1, x_2 ↔ y_2, …
  where x_1 = (x_1x, x_1y, x_1z)^T, etc.
Horn's method: correspondence is known.
Given the corresponding pairs x_1 ↔ y_1, x_2 ↔ y_2, …:
1. Compute the centers x̄ of X and ȳ of Y
2. Centering: X' = X − x̄, Y' = Y − ȳ
3. Compute the cross-covariance matrix S from X' and Y'
4. Build the 4×4 matrix K from S and compute its first eigenvector q: a quaternion
5. Convert q to the rotation matrix R; then t = x̄ − R ȳ
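The five steps can be sketched with NumPy as follows. The K-matrix layout follows Horn's 1987 quaternion formulation; the variable names are ours:

```python
import numpy as np

def horn(X, Y):
    """Closed-form alignment: find R, t minimizing
    sum_i ||x_i - (R y_i + t)||^2 for corresponding rows of X and Y.
    X, Y: (n, 3) arrays of corresponding 3D points."""
    xc = X.mean(axis=0)            # step 1: centers
    yc = Y.mean(axis=0)
    Xp, Yp = X - xc, Y - yc        # step 2: centering
    S = Yp.T @ Xp                  # step 3: 3x3 cross-covariance
    # step 4: symmetric 4x4 matrix K; its eigenvector for the largest
    # eigenvalue is the unit quaternion q = (qw, qx, qy, qz)
    A = S - S.T
    d = np.array([A[1, 2], A[2, 0], A[0, 1]])
    K = np.empty((4, 4))
    K[0, 0] = np.trace(S)
    K[0, 1:] = d
    K[1:, 0] = d
    K[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)
    eigvals, eigvecs = np.linalg.eigh(K)   # ascending eigenvalues
    qw, qx, qy, qz = eigvecs[:, -1]        # already unit norm
    # step 5: quaternion -> rotation matrix, then t = xc - R yc
    R = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qw*qz),     2*(qx*qz + qw*qy)],
        [2*(qx*qy + qw*qz),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qw*qx)],
        [2*(qx*qz - qw*qy),     2*(qy*qz + qw*qx),     1 - 2*(qx*qx + qy*qy)],
    ])
    t = xc - R @ yc
    return R, t
```

The quaternion sign ambiguity (q and −q describe the same rotation) is harmless here, since both map to the same R.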
Aligning 3D Data
• How to find correspondences: user input? Feature detection? Signatures?
• Alternative: assume closest points correspond
Aligning 3D Data
• Converges if the starting position is "close enough"
Closest Point
• Given two points r_1 and r_2, the Euclidean distance is:

  d(r_1, r_2) = ‖ r_1 − r_2 ‖ = √( (x_1 − x_2)² + (y_1 − y_2)² + (z_1 − z_2)² )

• Given a point r_1 and a set of points A, the Euclidean distance is:

  d(r_1, A) = min_{i ∈ 1..n} d(r_1, a_i)
Finding Matches
• The scene shape S is aligned to be in the best alignment with the model shape M.
• The distance of each point s of the scene from the model is:

  d(s, M) = min_{m ∈ M} ‖ m − s ‖
Finding Matches
  d(s, M) = min_{m ∈ M} ‖ m − s ‖ = d(s, y),  y ∈ M

  Y = C(S, M),  Y ⊆ M

• C: the closest-point operator
• Y: the set of closest points to S
Finding Matches
• Finding each match is performed in O(N_M) worst case.
• Given Y, we can calculate the alignment:

  (rot, trans, d) = Φ(S, Y)

• S is updated to be:

  S_new = rot(S) + trans
ICP: correspondence is unknown.
• For each point x_i in X, find the closest (nearest) point to x_i in Y and put that point into the set Y*.
ICP: correspondence is unknown.
• For each point x_i in X, find the closest (nearest) point to x_i in Y and put that point into the set Y*.
• Run Horn's method with X and Y* to estimate R and t.
ICP: correspondence is unknown.
• For each point x_i in X, find the closest (nearest) point to x_i in Y and put that point into the set Y*.
• Run Horn's method with X and Y* to estimate R and t.
• Apply them: Y ← R Y + t, and repeat.
The Algorithm
1. Init the error to ∞
2. Calculate correspondence: Y = CP(M, S), e
3. Calculate alignment: (rot, trans, d)
4. Apply alignment: S' = rot(S) + trans
5. Update error: d' = d
6. If error > threshold, go to step 2
The Algorithm

function ICP(Scene, Model)
begin
  E' ← +∞;
  (Rot, Trans) ← Initialize-Alignment(Scene, Model);
  repeat
    E ← E';
    Aligned-Scene ← Apply-Alignment(Scene, Rot, Trans);
    Pairs ← Return-Closest-Pairs(Aligned-Scene, Model);
    (Rot, Trans, E') ← Update-Alignment(Scene, Model, Pairs, Rot, Trans);
  until |E' − E| < Threshold
  return (Rot, Trans);
end
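A runnable 2-D sketch of this loop, with brute-force closest-point matching and a closed-form 2-D rigid alignment in place of Horn's 3-D method; all function names are ours, and the stopping test uses the change in mean squared error as in the pseudocode:

```python
import math

def align_2d(M, S):
    """Closed-form 2-D rigid alignment for corresponding pairs:
    find (R, t) minimizing sum ||m_i - (R s_i + t)||^2."""
    n = len(M)
    mcx = sum(p[0] for p in M) / n; mcy = sum(p[1] for p in M) / n
    scx = sum(p[0] for p in S) / n; scy = sum(p[1] for p in S) / n
    dot = cross = 0.0
    for (mx, my), (sx, sy) in zip(M, S):
        mx -= mcx; my -= mcy; sx -= scx; sy -= scy
        dot += sx * mx + sy * my          # sum of dot products
        cross += sx * my - sy * mx        # sum of 2-D cross products
    theta = math.atan2(cross, dot)        # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = mcx - (c * scx - s * scy)        # t = m_center - R s_center
    ty = mcy - (s * scx + c * scy)
    return c, s, tx, ty

def icp(model, scene, max_iter=50, tol=1e-9):
    """Brute-force ICP: match each scene point to its closest model
    point, solve the alignment in closed form, apply it, repeat."""
    S = list(scene)
    prev_err = float("inf")
    err = prev_err
    for _ in range(max_iter):
        # 1. correspondence: closest model point for every scene point
        pairs = [min(model, key=lambda m: (m[0]-sx)**2 + (m[1]-sy)**2)
                 for sx, sy in S]
        err = sum((m[0]-p[0])**2 + (m[1]-p[1])**2
                  for m, p in zip(pairs, S)) / len(S)
        if abs(prev_err - err) < tol:     # |E' - E| < Threshold
            break
        prev_err = err
        # 2. alignment against the matched pairs, 3. apply it
        c, s, tx, ty = align_2d(pairs, S)
        S = [(c*x - s*y + tx, s*x + c*y + ty) for x, y in S]
    return S, err
```

The brute-force matching step is the O(N_M · N_S) part that the k-d tree discussed later replaces.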
Convergence Theorem
• The ICP algorithm always converges monotonically to a local minimum with respect to the MSE distance objective function.
Time Analysis
Each iteration includes 3 main steps:
A. Finding the closest points: O(N_M) per point, O(N_M · N_S) total
B. Calculating the alignment: O(N_S)
C. Updating the scene: O(N_S)
Optimizing the Algorithm
The best-match / nearest-neighbour problem: given N records, each described by K real values (attributes), and a dissimilarity measure D, find the m records closest to a query record.
Optimizing the Algorithm
• k-d tree:
  • Construction time: O(k n log n)
  • Space: O(n)
  • Region query: O(n^(1−1/k) + k)
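A minimal k-d tree for the nearest-neighbour queries in ICP's matching step. This is a didactic sketch: the median-by-sorting construction shown here is O(n log² n) rather than the optimal bound above, and production code would use an optimized library:

```python
import math

def build_kdtree(points, depth=0, k=2):
    """Build a k-d tree: at each depth, split on axis (depth % k)
    around the median point. Nodes are (point, left, right)."""
    if not points:
        return None
    axis = depth % k
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build_kdtree(points[:mid], depth + 1, k),
            build_kdtree(points[mid + 1:], depth + 1, k))

def nearest(node, query, depth=0, k=2, best=None):
    """Recursive nearest-neighbour search; the far subtree is pruned
    when the splitting plane is further away than the current best."""
    if node is None:
        return best
    point, left, right = node
    d = math.dist(point, query)
    if best is None or d < best[1]:
        best = (point, d)
    axis = depth % k
    diff = query[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, query, depth + 1, k, best)
    if abs(diff) < best[1]:        # far side may still hold a closer point
        best = nearest(far, query, depth + 1, k, best)
    return best
```

Building the tree once over the model points lets every scene point be matched in O(log N_M) expected time instead of O(N_M).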
Time Analysis (with k-d tree)
Each iteration includes 3 main steps:
A. Finding the closest points: O(log N_M) per point, O(N_S log N_M) total
B. Calculating the alignment: O(N_S)
C. Updating the scene: O(N_S)
ICP Variants
• Variants on the following stages of ICP have been proposed:
1. Selecting sample points (from one or both point clouds)
2. Matching points to a plane or mesh
3. Weighting the correspondences
4. Rejecting certain (outlier) point pairs
5. Assigning an error metric to the current transform
6. Minimizing the error metric w.r.t. the transformation