Approximate Nearest Subspace Search with Applications to Pattern Recognition
Ronen Basri, Tal Hassner, Lihi Zelnik-Manor
Weizmann Institute / Caltech
Subspaces in Computer Vision
Zelnik-Manor & Irani, PAMI’06
Basri & Jacobs, PAMI’03
Nayar et al., IUW’96
• Illumination
• Faces
• Objects
• Viewpoint, Motion
• Dynamic textures
• …
Sequential Search
Database: n subspaces of dimension k, in d dimensions
Sequential search: O(ndk)
Too slow!
Is there a sublinear solution?
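A minimal sketch of the sequential baseline above (the function name and dimensions are illustrative, not from the paper): each subspace is stored as a d×k orthonormal basis S, and dist²(q,S) = ‖q − SSᵀq‖² is evaluated for every one of the n subspaces, giving O(ndk) time per query.

```python
import numpy as np

def nearest_subspace_sequential(q, bases):
    """Linear scan over all subspaces: O(ndk) for n subspaces.

    Each basis S is a d x k matrix with orthonormal columns, so the
    squared distance is the residual of projecting q onto span(S):
    dist^2(q, S) = ||q - S S^T q||^2.
    """
    best_i, best_d2 = -1, np.inf
    for i, S in enumerate(bases):
        residual = q - S @ (S.T @ q)   # q minus its projection onto span(S)
        d2 = float(residual @ residual)
        if d2 < best_d2:
            best_i, best_d2 = i, d2
    return best_i, best_d2

# Usage: n=3 random 2-dimensional subspaces of R^6
rng = np.random.default_rng(0)
bases = [np.linalg.qr(rng.standard_normal((6, 2)))[0] for _ in range(3)]
q = rng.standard_normal(6)
i, d2 = nearest_subspace_sequential(q, bases)
```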
A Related Problem: Nearest Neighbor Search
Database: n points in d dimensions
Sequential search: O(nd)
There is a sublinear solution!
Approximate NN: return a point within (1+ε)r of the query, where r is the distance to the true nearest neighbor
• Tree search (kd-trees)
• Locality Sensitive Hashing
Fast! Query: logarithmic; Preprocessing: O(dn)
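As an illustration of the point-ANN toolbox listed above, here is a minimal Locality Sensitive Hashing sketch using random hyperplanes (this variant is geared to angular similarity; the bucket structure and all names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, n_bits = 8, 1000, 12
points = rng.standard_normal((n, d))

# Random-hyperplane LSH: the hash of a point is the sign pattern of its
# projections onto n_bits random hyperplanes through the origin.
planes = rng.standard_normal((n_bits, d))

def lsh_key(x):
    return tuple(bool(t) for t in (planes @ x > 0))

# Preprocessing: bucket every database point by its hash key.
buckets = {}
for idx, p in enumerate(points):
    buckets.setdefault(lsh_key(p), []).append(idx)

def ann_query(q):
    """Scan only the query's bucket; fall back to a full scan if empty."""
    candidates = buckets.get(lsh_key(q), range(n))
    return min(candidates, key=lambda idx: float(np.sum((points[idx] - q) ** 2)))
```

Nearby points tend to share sign patterns, so a query scans only a small bucket instead of all n points; real LSH schemes repeat this with several hash tables to boost the probability of finding a near neighbor.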
Is it possible to speed up Nearest Subspace Search?
Existing point-based methods (tree search, LSH) cannot be applied directly.
Our Suggested Approach
• Reduction to points
• Works for both linear and affine subspaces
[Plot: run time vs. database size, sequential search vs. our method]
Problem Definition
S = subspace of dimension k;  q = query point.
Find mappings u = f(S), v = g(q) such that
‖u − v‖² = μ·dist²(q,S) + ω
and apply standard point ANN to u, v.
• A linear function of the original distance
• Monotonic in distance
• Independent mappings (f depends only on S; g depends only on q)
Finding a Reduction
dist²(q,S) = ‖SSᵀq − q‖² = −Vec(SSᵀ − I) · Vec(qqᵀ)
Take u = Vec(SSᵀ − I) (database) and v = Vec(qqᵀ) (query); then
‖u − v‖² = ‖u‖² + ‖v‖² − 2u·v = ‖u‖² + ‖v‖² + 2·dist²(q,S)
What are the constants?
‖u‖² = d − k (a constant)
‖v‖² = ‖q‖⁴ (depends on the query)
[Figure: the query q, the subspace S, and the projection SSᵀq]
Feeling lucky?
We are lucky!
Geometry of Basic Reduction
Database: u = Vec(SSᵀ − I), with ‖u‖² = d − k, lies on a sphere and on a hyperplane.
Query: v = Vec(qqᵀ), with ‖v‖² = ‖q‖⁴, lies on a cone.
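The identities on this and the preceding slides can be checked numerically; a small sketch with a random orthonormal basis S and query q (dimensions chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(2)
d, k = 6, 2
S = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal d x k basis
q = rng.standard_normal(d)

u = (S @ S.T - np.eye(d)).ravel()   # Vec(SS^T - I): the database point
v = np.outer(q, q).ravel()          # Vec(qq^T):     the query point

dist2 = float(np.sum((q - S @ (S.T @ q)) ** 2))    # dist^2(q, S)

lhs = float(np.sum((u - v) ** 2))                      # ||u - v||^2
rhs = float(np.sum(u**2) + np.sum(v**2) + 2 * dist2)   # ||u||^2 + ||v||^2 + 2 dist^2
```

The check confirms ‖u − v‖² = ‖u‖² + ‖v‖² + 2·dist²(q,S), with ‖u‖² = d − k fixed and ‖v‖² = ‖q‖⁴ depending only on the query.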
Can We Do Better?
If ω = 0: dist²(q,S) = 0 ⇒ ‖u − v‖² = 0 ⇒ u = v, i.e. only a trivial mapping is possible.
The additive constant is inherent.
Dimensionality May Be Large
• Embedding is in d² dimensions
• Might need to use a small ε
• Current solution:
  – Use random projections (Johnson-Lindenstrauss Lemma)
  – Repeat several times and select the nearest
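The random-projection step in the bullets above can be sketched as follows (a minimal illustration of a Johnson-Lindenstrauss style projection; the dimensions and names are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)
D, m = 3600, 200   # e.g. D = d^2 for d = 60; m is the reduced dimension

# A random Gaussian matrix scaled by 1/sqrt(m) approximately preserves
# Euclidean distances between points, with high probability.
R = rng.standard_normal((m, D)) / np.sqrt(m)

x = rng.standard_normal(D)
y = rng.standard_normal(D)
orig = float(np.linalg.norm(x - y))      # distance in D dimensions
proj = float(np.linalg.norm(R @ (x - y)))  # distance after projection to m dims
```

The projected distance concentrates around the original one, which is why the d²-dimensional embedded points can be searched in a much lower-dimensional space.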
Synthetic Data
Varying database size (d = 60, k = 4):
[Plot: run time vs. database size, sequential search vs. our method]
Varying dimension (n = 5000, k = 4):
[Plot: run time vs. dimension, sequential search vs. our method]
Summary
• Fast, approximate nearest subspace search
• Reduction to point ANN
• Useful applications in computer vision
• Disadvantages:
  – Embedding in d² dimensions
  – Additive constant
• Other methods?
• Additional applications?
A lot more to be done…