
Foundations of Optimization

IIT Kanpur, Prof. Joydeep Dutta

Updated On 02 Feb, 19

Overview

Basic facts about maxima and minima - Examples and modeling - Mathematical Prerequisites - Optimality conditions for Unconstrained Optimization - The Steepest Descent Method - Convergence analysis of the Steepest Descent Method - Newton's Method and Convergence Analysis - Quasi-Newton Methods - Conjugate Gradient Method - Fundamentals of Constrained Optimization - Minimizing a differentiable function over a convex set - Karush-Kuhn-Tucker Conditions - Active-Set Method - Quadratic Optimization - Penalty Function Method - Penalty Functions and Karush-Kuhn-Tucker Conditions - Sequential Quadratic Programming - Conic Optimization - Semidefinite Programming - Lagrangian Relaxations for Integer Programming - SDP relaxations for quadratic integer programming - The S-Lemma and Quadratic Programming Duality - Duality in optimization - Duality in conic and semidefinite programming - Trust Region Methods - Derivative-Free Optimization - Introduction to the Calculus of Variations.
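To give a flavor of the algorithmic part of the syllabus, the steepest descent method listed above can be sketched in a few lines. This is an illustrative example, not taken from the course materials; the quadratic objective, fixed step size, and stopping tolerance are all assumptions made for the sketch:

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Minimize a differentiable function by stepping along the
    negative gradient with a fixed step size until the gradient
    norm falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * g
    return x

# Example: minimize f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2,
# whose unique minimizer is (1, -3).
grad_f = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])
x_star = steepest_descent(grad_f, x0=[0.0, 0.0])
```

A fixed step size is the simplest choice; the course's convergence analysis would typically cover exact or backtracking line searches, which adapt the step length at each iteration.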

Lecture 8

Rating: 4.1 (11)


Lecture Details

Foundations of Optimization by Dr. Joydeep Dutta, Department of Mathematics, IIT Kanpur. For more details on NPTEL, visit http://nptel.ac.in

Comments

Sam

Excellent course, it helped me understand topics that I couldn't while attending my college.


Dembe

Great course. Thank you very much.
