# Foundations of Optimization

IIT Kanpur, Prof. Joydeep Dutta

Updated On 02 Feb, 19

- Basic facts about maxima and minima
- Examples and modeling
- Mathematical prerequisites
- Optimality conditions for unconstrained optimization
- The Steepest Descent Method
- Convergence analysis of the Steepest Descent Method
- Newton's Method and convergence analysis
- Quasi-Newton Methods
- Conjugate Gradient Method
- Fundamentals of constrained optimization
- Minimizing a differentiable function over a convex set
- Karush-Kuhn-Tucker conditions
- Active-Set Method
- Quadratic optimization
- Penalty Function Method
- Penalty functions and Karush-Kuhn-Tucker conditions
- Sequential Quadratic Programming
- Conic optimization
- Semi-definite programming
- Lagrangian relaxations for integer programming
- SDP relaxations for quadratic integer programming
- The S-Lemma and quadratic programming duality
- Duality in optimization
- Duality in conic and semidefinite programming
- Trust Region Methods
- Derivative-free optimization
- Introduction to the Calculus of Variations
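To give a flavour of the algorithmic portion of the syllabus, here is a minimal sketch of the steepest descent method applied to a strictly convex quadratic. The example problem, step-size rule, and all names are illustrative assumptions, not taken from the course materials; for a quadratic objective the exact line search happens to have a closed form, which keeps the sketch short.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 * x'Ax - b'x for symmetric positive definite A.

    At each step, move along the negative gradient r = b - Ax with the
    exact line-search step alpha = (r'r) / (r'Ar), which is optimal for
    quadratics.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = b - A @ x                      # negative gradient (residual)
        if np.linalg.norm(r) < tol:        # stop when the gradient is tiny
            break
        alpha = (r @ r) / (r @ (A @ r))    # exact step length for quadratics
        x = x + alpha * r
    return x

if __name__ == "__main__":
    # Illustrative 2x2 symmetric positive definite system.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = steepest_descent(A, b, np.zeros(2))
    print(x)  # minimizer of f, i.e. the solution of Ax = b
```

Since the gradient of f vanishes exactly where Ax = b, the returned point can be checked by substituting it back into the linear system.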

Rating: 4.1 (11)

Foundations of Optimization by Dr. Joydeep Dutta, Department of Mathematics, IIT Kanpur. For more details on NPTEL visit http://nptel.ac.in

Sam

Sep 12, 2018

Excellent course; it helped me understand topics that I couldn't while attending my college.

Dembe

March 29, 2019

Great course. Thank you very much.