Globally Optimal Estimates for Geometric Reconstruction Problems
Tom Gilat, Adi Lakritz
Advanced Topics in Computer Vision Seminar, Faculty of Mathematics and Computer Science, Weizmann Institute, 3 June 2007

TRANSCRIPT

Slide 1: Globally Optimal Estimates for Geometric Reconstruction Problems. Tom Gilat, Adi Lakritz. Advanced Topics in Computer Vision Seminar, Faculty of Mathematics and Computer Science, Weizmann Institute, 3 June 2007.

Slide 2: Outline. Motivation and introduction. Background: positive semidefinite (PSD) matrices, linear matrix inequalities (LMI), semidefinite programming (SDP). Relaxations: sum-of-squares (SOS) relaxation, LMI relaxation. Application in vision: finding optimal structure; partial relaxation and Schur's complement.

Slide 3: Motivation. Geometric reconstruction problems lead to polynomial optimization problems (POPs).

Slide 4: The triangulation problem in the L2 norm. Multi-view: an optimization problem; two views: an exact solution exists.

Slide 5: The triangulation problem in the L2 norm. [Figure: a perspective camera i observing a 3D point (x, y, z), with reprojection error err in the image plane.]

Slide 6: The triangulation problem in the L2 norm. Minimize the reprojection error over all cameras. This is a polynomial minimization problem, and it is non-convex.

Slide 7: More computer vision problems. The reconstruction problem: given known cameras and known corresponding points, find the 3D points that minimize the projection error of the given image points; this is like triangulation, for many points and cameras. Calculating a homography: given 3D points on a plane and corresponding image points, calculate the homography. And many more problems.

Slide 8: Optimization problems.

Slide 9: Introduction to optimization problems.

Slide 10: Optimization problems. In general, such problems are NP-complete.

Slide 11: Optimization problems. [Diagram: optimization splits into convex and non-convex. For convex problems, such as linear programming (LP) and semidefinite programming (SDP), solutions exist via interior-point methods. The remaining problems suffer from local optima or high computational cost.]

Slide 12: Non-convex optimization. [Figure: level curves of f with an initialization; min/max points; a non-convex feasible set.] Many algorithms get stuck in local minima.

Slide 13: Optimization problems. [Diagram, extending slide 11: for non-convex problems there are global optimization algorithms that converge to the optimal solution by relaxing the problem.]

Slide 14: Outline (repeated).

Slide 15: Positive semidefinite (PSD) matrices. Definition: a matrix M in R^(n x n) is PSD if (1) M is symmetric, M = M^T, and (2) x^T M x >= 0 for all x in R^n. Such an M can be decomposed as M = A A^T (Cholesky). Proof: [e.g. via the spectral decomposition M = Q L Q^T = (Q L^(1/2)) (Q L^(1/2))^T, with L the diagonal matrix of non-negative eigenvalues].

Slide 16: Positive semidefinite (PSD) matrices (definition repeated): M is PSD if M = M^T and x^T M x >= 0 for all x; M can then be decomposed as M = A A^T (Cholesky).
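As a concrete illustration of this definition (not from the slides), here is a minimal Python sketch, assuming numpy is available; the matrices are arbitrary illustrative data:

    import numpy as np

    def is_psd(M, tol=1e-10):
        # The slide's definition: M symmetric and x^T M x >= 0 for all x.
        # For a symmetric M, the second condition is equivalent to all
        # eigenvalues of M being non-negative.
        return np.allclose(M, M.T) and np.linalg.eigvalsh(M).min() >= -tol

    A = np.array([[1.0, 0.0],
                  [2.0, 3.0]])
    M = A @ A.T                       # any A A^T is PSD, as on the slide
    print(is_psd(M))                  # True
    print(is_psd(np.array([[1.0, 2.0],
                           [2.0, 1.0]])))  # False: eigenvalues are 3 and -1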
Slide 17: Principal minors. The k-th order principal minors of an n x n symmetric matrix M are the determinants of the k x k matrices obtained by deleting n - k rows and the corresponding n - k columns of M. First order: the elements on the diagonal. Second order: determinants of 2 x 2 submatrices.

Slides 18-20: Principal minors, continued (stepwise repetition of the same definition). First order: the diagonal elements; second order: determinants of 2 x 2 submatrices; third order (for a 3 x 3 matrix): det(M).

Slide 21: The set of PSD matrices in 2D. [Figure.]

Slide 22: The set of PSD matrices is convex. Proof: for PSD M1, M2 and t in [0, 1], x^T (t M1 + (1 - t) M2) x = t x^T M1 x + (1 - t) x^T M2 x >= 0 for every x, so the convex combination is PSD.

Slide 23: LMI: linear matrix inequality. [A constraint of the form F(x) = F_0 + x_1 F_1 + ... + x_n F_n PSD, with symmetric matrices F_i.]

Slide 24: LMI example: find the feasible set of a 2D LMI. [The example matrix was shown on the slide.]

Slide 25: Reminder. [A symmetric matrix is PSD if and only if all of its principal minors are non-negative.]

Slide 26: LMI example, continued: the first-order principal minors give the first inequalities.

Slide 27: LMI example, continued: the second-order principal minors.

Slide 28: LMI example, continued: the third-order principal minors. The feasible set is the intersection of all of these inequalities.

Slide 29: Semidefinite programming (SDP): optimization of a linear objective subject to an LMI constraint; an extension of LP.

Slide 30: Outline (repeated).

Slide 31: Sum-of-squares (SOS) relaxation. An unconstrained polynomial optimization problem (POP) is one whose feasible set is R^n. [H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.]

Slide 32: Sum-of-squares (SOS) relaxation. [Equation slide: a polynomial p is SOS if p(x) = sum_i q_i(x)^2 for some polynomials q_i.]

Slide 33: SOS relaxation for unconstrained polynomials. [Lower-bound formulation: maximize gamma such that p(x) - gamma is SOS.]

Slide 34: [Equation slide, lost in transcription.]

Slide 35: Monomial basis example. [Write p(x) - gamma = m(x)^T Q m(x) for a vector m(x) of monomials and a symmetric coefficient matrix Q.]

Slide 36: SOS relaxation to SDP. [The SOS condition is equivalent to the existence of a PSD coefficient matrix Q, so the relaxation is an SDP.]

Slide 37: SOS relaxation to SDP, example. [Worked example leading to an SDP.]

Slide 38: SOS for constrained POPs. It is possible to extend this method to constrained POPs by using the generalized Lagrange dual.

Slide 39: SOS relaxation summary. So we know how to solve a POP that is an SOS, and we have a bound on a POP that is not an SOS. Pipeline: POP -> SOS problem (SOS relaxation) -> SDP -> global estimate. [H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.]
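To make the SOS-to-SDP step concrete, here is a minimal sketch (not from the slides), assuming the cvxpy modeling library; the quartic p(x) = x^4 - 2x^2 + 5 and the monomial basis m(x) = [1, x, x^2] are illustrative choices:

    import cvxpy as cp

    # SOS lower bound for p(x) = x^4 - 2x^2 + 5 (global minimum 4 at x = +-1).
    # Write p(x) - gamma = m(x)^T Q m(x) with m(x) = [1, x, x^2]^T and Q PSD;
    # matching coefficients of 1, x, x^2, x^3, x^4 gives linear constraints,
    # so maximizing gamma is an SDP.
    Q = cp.Variable((3, 3), symmetric=True)
    gamma = cp.Variable()
    constraints = [
        Q >> 0,                        # Q is PSD
        Q[0, 0] == 5 - gamma,          # constant term
        2 * Q[0, 1] == 0,              # coefficient of x
        2 * Q[0, 2] + Q[1, 1] == -2,   # coefficient of x^2
        2 * Q[1, 2] == 0,              # coefficient of x^3
        Q[2, 2] == 1,                  # coefficient of x^4
    ]
    cp.Problem(cp.Maximize(gamma), constraints).solve()
    print(gamma.value)  # approximately 4.0, the global minimum of p

Here the SOS bound is exact, since p(x) - 4 = (x^2 - 1)^2 is itself a sum of squares.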
Slide 40: Relaxations compared. SOS: POP -> SOS problem (SOS relaxation) -> SDP -> global estimate. LMI: POP -> linear & LMI problem (LMI relaxation) -> SDP -> global estimate, with guaranteed convergence.

Slide 41: Outline (repeated).

Slide 42: LMI relaxations. Constraints are handled; convergence to the optimum is guaranteed; the method applies to all polynomials, not only to SOS ones.

Slide 43: A maximization problem. Note that (a) the feasible set is non-convex, and (b) the constraints are quadratic. [Figure: the feasible set.]

Slide 44: LMI: linear matrix inequality, a reminder.

Slide 45: Goal. Motivation: transform a polynomial optimization problem into an SDP whose solution is close to the global optimum of the original problem. What is it good for? SDP problems can be solved much more efficiently than general optimization problems.

Slide 46: LMI relaxation is an iterative process: POP -> linear + LMI + rank constraints -> SDP. Step 1: introduce new variables. Step 2: relax constraints. Then apply relaxations of higher order.

Slide 47: LMI relaxation, step 1. Replace monomials by lifting variables. Rule: [each monomial x_1^a x_2^b becomes a variable y_ab]. Example: the R^2 case.

Slide 48: Introducing the lifting variables (lifting). [Equation slide.]

Slide 49: The lifted problem is not equivalent to the original problem: the lifting variables are not independent in the original problem [e.g. y_20 = y_10^2 and y_11 = y_10 y_01]. The new problem is linear, in particular convex.

Slide 50: Goal, more specifically: the linear problem (obtained by lifting) plus relation constraints on the lifting variables, followed by an SDP relaxation.

Slide 51: Question: how do we guarantee that the relations between the lifting variables hold?

Slide 52: LMI relaxation, step 2. Take a basis v(x) of the polynomials of degree 1 and apply lifting to the matrix v(x) v(x)^T; the result is a matrix M whose entries are the lifting variables. [Equations lost in transcription.]

Slide 53: If the relation constraints hold, then M is PSD and rank M = 1. This is because, assuming the relations hold, we can decompose M as M = v v^T.

Slide 54: We have seen: if the relation constraints hold, then [M is PSD of rank one]. What about the opposite direction: does [M PSD of rank one] imply that the relation constraints hold? This is true as well.

Slide 55: LMI relaxation, step 2, continued. The converse holds by the following: [a rank-one PSD matrix with leading entry 1 decomposes as v v^T, and] all the relation equalities are among the entry-wise equalities [this decomposition imposes], so the relation constraints hold.

Slide 56: Conclusion of the analysis. [Diagram: within the feasible set in the lifted variables y, the subset on which the relation constraints hold is exactly the subset satisfying the rank-one moment-matrix condition.]

Slide 57: Relaxation, at last. The original problem is equivalent to the lifted problem together with the additional constraint that M_1(y) is PSD and rank M_1(y) = 1, where M_1(y) denotes the moment matrix of order 1. We relax by dropping the non-convex rank constraint.

Slide 58: LMI relaxation of order 1. [Figure: the feasible set.]

Slide 59: Rank-constrained LMI vs. unconstrained. [Figure.]

Slide 60: LMI relaxations of higher order. It turns out that we can do better: applying LMI relaxations of higher order gives a tighter SDP. Relaxations of higher order incorporate the inequality constraints into LMIs. We show the relaxation of order 2; it is possible to continue and apply further relaxations. Theory guarantees convergence to the global optimum.

Slide 61: LMI relaxations of second order. Let [v_2(x)] be a basis of the polynomials of degree 2. Again, lifting [v_2(x) v_2(x)^T] gives [a moment matrix of order 2], and again we relax by dropping the rank constraint.

Slide 62: Inequality constraints to LMIs. Replace the constraints by LMIs to obtain a tighter relaxation: a linear constraint becomes, after lifting, an LMI constraint. [Example lost in transcription.]

Slide 63: This procedure yields a new SDP. The second SDP's feasible set is contained in the first SDP's feasible set. Similarly, we can continue and apply relaxations of higher order.

Slide 64: Theoretical basis for the LMI relaxations. If the feasible set defined by the constraints is compact, then under mild additional assumptions, Lasserre proved in 2001 that there is an asymptotic convergence guarantee: the solution of the k-th relaxation converges, as k grows, to the solution of the original problem (finding a maximum). Moreover, convergence is fast: the k-th solution is very close to the optimum already for small k. [Lasserre, J.B. (2001). "Global optimization with polynomials and the problem of moments." SIAM J. Optimization 11, pp. 796-817.]
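A minimal sketch of the order-1 relaxation just described (lifting, then dropping the rank constraint), again assuming cvxpy; the toy problem, minimizing x1*x2 over the unit disk, is an illustrative choice, not from the slides:

    import cvxpy as cp

    # Non-convex POP: minimize x1*x2 subject to x1^2 + x2^2 <= 1 (optimum -1/2).
    # Lifting: M is the order-1 moment matrix
    #   [[1,   y10, y01],
    #    [y10, y20, y11],
    #    [y01, y11, y02]]
    # where y10 <- x1, y01 <- x2, y20 <- x1^2, y11 <- x1*x2, y02 <- x2^2.
    M = cp.Variable((3, 3), symmetric=True)
    constraints = [
        M >> 0,                  # kept: M is PSD
        # dropped: rank(M) = 1   (the non-convex relation constraint)
        M[0, 0] == 1,
        M[1, 1] + M[2, 2] <= 1,  # lifted version of x1^2 + x2^2 <= 1
    ]
    prob = cp.Problem(cp.Minimize(M[1, 2]), constraints)  # M[1, 2] is y11
    prob.solve()
    print(prob.value)  # approximately -0.5; the relaxation is tight here

The optimal moment matrix here has rank one, which is exactly the certificate of global optimality discussed on the next slides.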
Slide 65: Checking global optimality. The method provides a certificate of global optimality: [when minimizing, a low-rank (rank-one) moment matrix implies that the SDP solution is the global optimum]. An important experimental observation: [this certificate typically holds in practice].

Slide 66: We add to the objective function the trace of the moment matrix, weighted by a sufficiently small positive scalar [a heuristic that steers the solver toward low-rank moment matrices].

Slide 67: Application: LMI relaxations in vision.

Slide 68: Outline (repeated).

Slide 69: Finding optimal structure. [Figure: camera center and image plane.] A perspective camera: the relation between a point U in 3D space and its image u in the image plane is given by [lambda u = P U], where P is the camera matrix and [lambda] is the depth. Measured image points are corrupted by independent Gaussian noise. We want to minimize the least-squares error between the measured and projected points.

Slide 70: We therefore have the following optimization problem: minimize the sum of d(u_i, projected point)^2 over [the set of unknowns, the 3D points], where d is the Euclidean distance. Each term in the cost function can be written as [a rational function p_i / q_i^2], where [p_i and q_i] are polynomials. Our objective is therefore to minimize a sum of rational functions.

Slide 71: How can we turn this optimization problem into a polynomial optimization problem? Suppose that each term in [the cost] has an upper bound [lambda_i]; then the optimization problem is equivalent to the following: [minimize the sum of the lambda_i subject to p_i <= lambda_i q_i^2]. This is a polynomial optimization problem, to which we apply LMI relaxations. Note that we introduced many new variables, one for each term.

Slide 72: Partial relaxations. Problem: an SDP with a large number of variables can be computationally demanding. A large number of variables can arise from LMI relaxations of high order, or from the introduction of new variables as we have just seen. This is where partial relaxations come in; for that we introduce Schur's complement.

Slide 73: Schur's complement. Set [M = [[A, B], [B^T, C]] with A invertible].

Slide 74: Applying Schur's complement. [M is positive definite if and only if A is positive definite and the Schur complement is positive definite:] C - B^T A^(-1) B > 0.

Slide 75: Partial relaxations. Schur's complement allows us to restate the optimization problem [as one large LMI]. The only non-linearity is due to [the 3D structure variables]; we can apply LMI relaxations only on [those] and leave [the remaining variables unrelaxed]. If we were to apply a full relaxation over all variables, the problem would become intractable even for small N.

Slide 76: Partial relaxations. Disadvantage of partial relaxations: we are not able to ensure asymptotic convergence to the global optimum. However, we have a numerical certificate of global optimality, just as in the case of full relaxations: if the moment matrix of the relaxed variables has rank one, then the solution of the partially relaxed problem is the global optimum.
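A small numeric sanity check of Schur's complement from slides 73-74 (not from the slides themselves), assuming numpy; the blocks A, B, C are arbitrary illustrative data:

    import numpy as np

    # Schur's complement: for M = [[A, B], [B^T, C]] with A invertible,
    # M > 0 (positive definite) iff A > 0 and C - B^T A^(-1) B > 0.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])                        # A > 0
    B = np.array([[1.0, 2.0],
                  [0.0, 1.0]])
    C = B.T @ np.linalg.inv(A) @ B + 2.0 * np.eye(2)  # complement is 2*I > 0

    M = np.block([[A, B], [B.T, C]])
    S = C - B.T @ np.linalg.inv(A) @ B                # Schur complement of A

    print(np.linalg.eigvalsh(M).min() > 0)  # True: M is positive definite
    print(np.linalg.eigvalsh(S).min() > 0)  # True: matches the criterion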
Slide 77: Full relaxation vs. partial relaxation. Application: triangulation with 3 cameras. Goal: find the optimal 3D point. The camera matrices are known, and the measured point is assumed to lie at the origin of each view. Camera matrices: [given on the slide].

Slide 78: Summary. Geometric vision problems as POPs: the triangulation and reconstruction problems. Relaxations of POPs: the sum-of-squares (SOS) relaxation, which guarantees a bound on the optimal solution and whose solution is usually optimal; the linear matrix inequality (LMI) relaxation, with the first order obtained by lifting and dropping the rank constraint, higher orders turning linear constraints into LMIs, a guarantee of convergence due to Lasserre, and a certificate of global optimality. Application in vision: finding optimal structure; partial relaxation and Schur's complement; the triangulation problem and the benefit of partial relaxations.

Slide 79: References.
F. Kahl and D. Henrion. Globally optimal estimates for geometric reconstruction problems. Accepted to IJCV.
H. Waki, S. Kim, M. Kojima, and M. Muramatsu. Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity. SIAM J. Optimization, 17(1):218-242, 2006.
J. B. Lasserre. Global optimization with polynomials and the problem of moments. SIAM J. Optimization, 11:796-817, 2001.
S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Second edition. Cambridge University Press, 2004.