Foundations of Optimization

Topics covered:
  • Basic facts about maxima and minima
  • Examples and modeling
  • Mathematical prerequisites
  • Optimality conditions for unconstrained optimization
  • The steepest descent method
  • Convergence analysis of the steepest descent method
  • Newton's method and convergence analysis
  • Quasi-Newton methods
  • Conjugate gradient method
  • Fundamentals of constrained optimization
  • Minimizing a differentiable function over a convex set
  • Karush-Kuhn-Tucker conditions
  • Active-set method
  • Quadratic optimization
  • Penalty function method
  • Penalty functions and Karush-Kuhn-Tucker conditions
  • Sequential quadratic programming
  • Conic optimization
  • Semidefinite programming
  • Lagrangian relaxations for integer programming
  • SDP relaxations for quadratic integer programming
  • The S-lemma and quadratic programming duality
  • Duality in optimization
  • Duality in conic and semidefinite programming
  • Trust region methods
  • Derivative-free optimization
  • Introduction to the calculus of variations
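To illustrate one of the core topics above, the steepest descent method iterates x ← x − α∇f(x), moving against the gradient until it (approximately) vanishes. The sketch below is not taken from the course materials; the test function, step size, and stopping rule are illustrative assumptions (a fixed step is used here, whereas a full treatment also covers line searches).

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a differentiable function via x <- x - step * grad(x).

    Stops when the gradient norm falls below tol (a first-order
    optimality condition for unconstrained minimization).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # approximate stationary point
            break
        x = x - step * g
    return x

# Illustrative quadratic f(x, y) = (x - 1)^2 + 2*(y + 3)^2,
# whose unique minimizer is (1, -3).
grad_f = lambda v: np.array([2.0 * (v[0] - 1.0), 4.0 * (v[1] + 3.0)])
x_star = steepest_descent(grad_f, x0=[0.0, 0.0])
```

For this quadratic the gradient is Lipschitz with constant 4, so any fixed step below 2/4 = 0.5 converges; the chosen step of 0.1 satisfies this, which is the kind of condition the convergence-analysis lectures make precise.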

FreeVideoLectures.com All rights reserved.