Post on 30-Jan-2016
Lightseminar: Learned Representation in AI
An Introduction to Locally Linear Embedding
Lawrence K. Saul, Sam T. Roweis
presented by Chan-Su Lee
Introduction
● Preprocessing
– Obtain a more useful representation of the information
– Useful for subsequent operations such as classification and pattern recognition
● Unsupervised learning
– Automatic methods to recover hidden structure
– Density estimation
– Dimensionality reduction
PCA & MDS
● Principal Component Analysis (PCA)
– Linear projections of greatest variance from the top eigenvectors of the data covariance matrix
● Multidimensional Scaling (MDS)
– Low-dimensional embedding that best preserves pairwise distances between data points
● Both model linear variabilities in high-dimensional data
● No local minima
● Both find a linear subspace and cannot deal properly with data lying on nonlinear manifolds
Problem in PCA and MDS
● Create distortions in local and global geometry
– May map faraway data points to nearby points
LLE (Locally Linear Embedding)
● Property
– Preserves the local configurations of nearest neighbors
● LLE
– Local: only neighbors contribute to each reconstruction
– Linear: reconstructions are confined to linear subspaces
● Assumptions
– Well-sampled data -> locally linear patches of the manifold
– d-dimensional manifold -> 2d neighbors
LLE Algorithm
● Step 1
Neighborhood Search
– Compute the neighbors of each data point
– K nearest neighbors per data point
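Step 1 can be sketched as a brute-force K-nearest-neighbor search (a minimal illustration assuming NumPy; the function name is ours, not from the slides):

```python
import numpy as np

def nearest_neighbors(X, K):
    """Indices of the K nearest neighbors of each row of X
    (brute-force Euclidean distances, the point itself excluded)."""
    # Pairwise squared distances via |a - b|^2 = |a|^2 - 2 a.b + |b|^2
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] - 2.0 * X @ X.T + sq[None, :]
    order = np.argsort(d2, axis=1)
    return order[:, 1:K + 1]  # column 0 is the point itself

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])
print(nearest_neighbors(X, 2)[0])  # the two points closest to X[0]
```

For large N one would use a spatial index (e.g. a k-d tree) instead of the O(N²) distance matrix.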
LLE Algorithm
● Step 2
Constrained Least-Squares Fits
– Reconstruction error: E(W) = Σ_i | x_i − Σ_j W_ij x_j |²
– Constraints: W_ij = 0 if x_j is not a neighbor of x_i, and Σ_j W_ij = 1
-> Invariant to rotations, rescalings, and translations of that point and its neighbors
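The constrained fit of Step 2 has a closed form: for each point, solve the local Gram system C w = 1 and rescale the weights to sum to one. A hedged sketch assuming NumPy (the regularization term and the function name are our additions, not part of the slides):

```python
import numpy as np

def reconstruction_weights(X, neighbors, reg=1e-3):
    """Weights W minimizing sum_i |x_i - sum_j W_ij x_j|^2 subject to
    sum_j W_ij = 1 and W_ij = 0 for non-neighbors (Step 2)."""
    N, K = neighbors.shape
    W = np.zeros((N, N))
    for i in range(N):
        Z = X[neighbors[i]] - X[i]               # neighbors relative to x_i
        C = Z @ Z.T                              # K x K local Gram matrix
        C = C + reg * np.trace(C) * np.eye(K)    # guard against singular C
        w = np.linalg.solve(C, np.ones(K))       # solve C w = 1
        W[i, neighbors[i]] = w / w.sum()         # enforce sum-to-one constraint
    return W
```

Because each row has only K nonzero entries, W is sparse in practice, which is what makes the final eigenproblem tractable.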
LLE Algorithm
● Step 3
Eigenvalue Problem
– Embedding cost: Φ(Y) = Σ_i | y_i − Σ_j W_ij y_j |²
– Constraint: Σ_i y_i = 0 -> embedding centered at the origin
– Constraint: (1/N) Σ_i y_i y_iᵀ = I -> avoids a degenerate solution
– Solution: bottom d+1 eigenvectors of M = (I − W)ᵀ(I − W), discarding the bottom (constant) eigenvector
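Step 3 thus reduces to a symmetric eigenproblem on M = (I − W)ᵀ(I − W). A minimal dense sketch assuming NumPy (a real implementation would use a sparse eigensolver; the toy ring example is ours):

```python
import numpy as np

def embed(W, d):
    """Bottom d nonconstant eigenvectors of M = (I - W)^T (I - W):
    coordinates minimizing the Step 3 embedding cost under the
    centering and normalization constraints."""
    N = W.shape[0]
    A = np.eye(N) - W
    M = A.T @ A
    vals, vecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    return vecs[:, 1:d + 1]          # skip the constant (unit) eigenvector

# Toy example: points on a ring, each reconstructed from its two neighbors
N = 8
W = np.zeros((N, N))
for i in range(N):
    W[i, (i - 1) % N] = W[i, (i + 1) % N] = 0.5
Y = embed(W, 2)
print(Y.shape)  # (8, 2)
```

The returned columns are orthonormal and sum to zero, which realizes the centering constraint automatically: they are orthogonal to the constant eigenvector that was discarded.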
Embedding by LLE
Corner faces map to the corners of the two-dimensional embedding
Examples
LLE Advantages
● Ability to discover nonlinear manifolds of arbitrary dimension
● Non-iterative
● Global optimality
● Few parameters: K, d
● O(DN²) time and space-efficient due to sparse matrices
LLE Disadvantages
● Requires a smooth, non-closed, densely sampled manifold
● Quality of the manifold characterization depends on the neighborhood choice
● Cannot estimate the intrinsic low dimensionality d
● Sensitive to outliers
Comparisons: PCA vs LLE vs Isomap
● PCA: finds linear subspace projection vectors that best fit the data
● LLE: finds embedding coordinate vectors that best preserve local neighborhood relationships
● Isomap: finds embedding coordinate vectors that preserve geodesic (shortest-path) distances