Survey of unconstrained optimization: gradient-based algorithms
Unconstrained minimization; steepest descent vs. conjugate gradients; Newton and quasi-Newton methods; MATLAB fminunc


TRANSCRIPT

Slide 1

Slide 2
Survey of unconstrained optimization: gradient-based algorithms
  • Unconstrained minimization
  • Steepest descent vs. conjugate gradients
  • Newton and quasi-Newton methods
  • MATLAB fminunc

Slide 3
Unconstrained local minimization
  • The necessity of one-dimensional searches along a chosen direction.
  • The most intuitive choice of the search direction s_k is the direction of steepest descent, s_k = −∇f(x_k).
  • This choice, however, is very poor: the iterates zig-zag and convergence is slow.
  • Methods are based on the dictum that all functions of interest are locally quadratic.
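
As a concrete illustration of a one-dimensional search along the steepest-descent direction, here is a minimal MATLAB sketch (not taken from the slides): the Rosenbrock test function, the starting point, and the Armijo backtracking parameters are illustrative assumptions.

    % Minimal steepest-descent sketch with an Armijo backtracking line
    % search. f, gradf, x0, and the tolerances are illustrative choices.
    f     = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % Rosenbrock
    gradf = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
                   200*(x(2) - x(1)^2)];
    x = [-1.2; 1];
    for k = 1:5000
        g = gradf(x);
        if norm(g) < 1e-6, break; end     % stop when the gradient is small
        s = -g;                           % steepest-descent direction
        t = 1;                            % one-dimensional (Armijo) search
        while f(x + t*s) > f(x) + 1e-4*t*(g'*s)
            t = t/2;
        end
        x = x + t*s;
    end
    disp(x')    % creeps slowly toward the minimizer (1, 1)

Even with an accurate line search the iterates zig-zag across the curved valley of this function, which is the sense in which the slide calls the steepest-descent choice poor.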

Slide 4
Conjugate gradients
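
The slide gives only the heading, so as a hedged sketch here is the standard Fletcher-Reeves nonlinear conjugate-gradient update (one common variant, not necessarily the one the slide derived), reusing the Rosenbrock handles from above. The idea is that each new direction combines the current negative gradient with the previous direction, making successive directions conjugate with respect to the Hessian.

    % Nonlinear conjugate gradients (Fletcher-Reeves), a sketch.
    f     = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    gradf = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
                   200*(x(2) - x(1)^2)];
    x = [-1.2; 1];
    g = gradf(x);
    s = -g;                               % first step: steepest descent
    n = numel(x);
    for k = 1:500
        if norm(g) < 1e-8, break; end
        % 1-D search along s (tight tolerance so small steps resolve)
        t = fminbnd(@(a) f(x + a*s), 0, 1, optimset('TolX', 1e-10));
        x = x + t*s;
        gnew = gradf(x);
        beta = (gnew'*gnew)/(g'*g);       % Fletcher-Reeves coefficient
        s = -gnew + beta*s;               % new conjugate direction
        g = gnew;
        if mod(k, n) == 0, s = -g; end    % periodic restart
    end
    disp(x')

For a quadratic in n variables, exact line searches make these directions mutually conjugate, so the method terminates in at most n steps; on general functions it typically beats steepest descent at essentially the same cost per iteration.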

Slide 5
Newton and quasi-Newton methods
  • Newton's method uses the exact Hessian: x_{k+1} = x_k − [∇²f(x_k)]⁻¹ ∇f(x_k).
  • Quasi-Newton methods use successive evaluations of gradients to obtain an approximation to the Hessian or its inverse.
  • MATLAB's fminunc uses a variant of Newton's method if a gradient routine is provided, and otherwise the BFGS quasi-Newton method.
  • The Newton variant is called the trust-region approach and is based on using a quadratic approximation of the function inside a box.
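
A minimal sketch of how a quasi-Newton method builds Hessian information from successive gradients, here using the standard BFGS update of an inverse-Hessian approximation (the same family of update behind fminunc's quasi-newton algorithm); the handles, tolerances, and iteration limits are illustrative assumptions.

    % BFGS quasi-Newton sketch: successive gradient differences update
    % an approximation B to the inverse Hessian.
    f     = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    gradf = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
                   200*(x(2) - x(1)^2)];
    x = [-1.2; 1];
    B = eye(2);                           % inverse-Hessian approximation
    g = gradf(x);
    for k = 1:200
        if norm(g) < 1e-8, break; end
        s = -B*g;                         % quasi-Newton direction
        t = 1;                            % Armijo backtracking
        while f(x + t*s) > f(x) + 1e-4*t*(g'*s)
            t = t/2;
        end
        dx   = t*s;
        x    = x + dx;
        gnew = gradf(x);
        y    = gnew - g;
        if y'*dx > 0                      % curvature condition keeps B positive definite
            rho = 1/(y'*dx);
            I   = eye(2);
            B   = (I - rho*(dx*y'))*B*(I - rho*(y*dx')) + rho*(dx*dx');
        end
        g = gnew;
    end
    disp(x')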

Slide 6
Problems: unconstrained algorithms
  • Explain the differences and commonalities of steepest descent, conjugate gradients, Newton's method, and quasi-Newton methods for unconstrained minimization. Solution on the Notes page.
  • Use fminunc to minimize the Rosenbrock banana function, and compare the trajectories of fminsearch and fminunc starting from (-1.2, 1), with and without a routine for calculating the gradient. Plot the three trajectories. Solution.
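
One way to set up the three runs in the second problem is sketched below. The function name bananaWithGrad is a hypothetical choice, the option names follow current Optimization Toolbox releases (older releases use optimset with 'GradObj','on' instead), and recording the trajectories for plotting, e.g. via an OutputFcn, is left out for brevity.

    % bananaWithGrad.m -- Rosenbrock value and analytic gradient
    % (hypothetical file name; save as its own function file).
    function [fv, g] = bananaWithGrad(x)
    fv = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    if nargout > 1
        g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
              200*(x(2) - x(1)^2)];
    end
    end

    % Script: three runs from the same starting point.
    x0     = [-1.2; 1];
    banana = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;

    x1 = fminsearch(banana, x0);          % Nelder-Mead simplex, no gradients

    opts = optimoptions('fminunc', 'Algorithm', 'quasi-newton');
    x2 = fminunc(banana, x0, opts);       % BFGS, finite-difference gradient

    opts = optimoptions('fminunc', 'Algorithm', 'trust-region', ...
                        'SpecifyObjectiveGradient', true);
    x3 = fminunc(@bananaWithGrad, x0, opts);   % trust-region Newton variant

All three should reach the minimizer near (1, 1); the gradient-based runs typically need far fewer function evaluations than the simplex search, with the trust-region run converging fastest.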