Foundations of Optimization

IIT Kanpur Course, Prof. Joydeep Dutta

240 students enrolled

Overview

Basic facts about maxima and minima - Examples and modeling - Mathematical prerequisites - Optimality conditions for unconstrained optimization - The Steepest Descent Method - Convergence analysis of the Steepest Descent Method - Newton's Method and convergence analysis - Quasi-Newton Methods - Conjugate Gradient Method - Fundamentals of constrained optimization - Minimizing a differentiable function over a convex set - Karush-Kuhn-Tucker conditions - Active-Set Method - Quadratic optimization - Penalty Function Method - Penalty functions and Karush-Kuhn-Tucker conditions - Sequential Quadratic Programming - Conic optimization - Semidefinite programming - Lagrangian relaxations for integer programming - SDP relaxations for quadratic integer programming - The S-Lemma and quadratic programming duality - Duality in optimization - Duality in conic and semidefinite programming - Trust Region Methods - Derivative-Free Optimization - Introduction to the Calculus of Variations.
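Of the methods listed above, steepest descent is the simplest to illustrate: repeatedly step against the gradient until it is nearly zero. A minimal sketch (the quadratic objective, step size, and tolerance below are illustrative assumptions, not course material):

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a differentiable function by stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is nearly zero
            break
        x = x - lr * g                # move in the direction of steepest descent
    return x

# Illustrative example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3)
grad_f = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])
x_star = steepest_descent(grad_f, x0=[0.0, 0.0])
```

A fixed step size is the crudest choice; the lectures on convergence analysis study how the step length and the conditioning of the objective govern the rate of convergence.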

Lecture 1:

        3.3 (25 Ratings)

Lecture Details

Foundations of Optimization by Dr. Joydeep Dutta, Department of Mathematics, IIT Kanpur. For more details on NPTEL visit http://nptel.ac.in

        LECTURES



        Review


3.3 average (25 Ratings)

5 stars: 44% (11)
4 stars: 4% (1)
3 stars: 12% (3)
2 stars: 16% (4)
1 star: 24% (6)
