Carnegie Mellon University Course , Fall 2018 , Prof. Ryan Tibshirani

**417** students enrolled

Nearly every problem in machine learning and computational statistics can be formulated in terms of the optimization of some function, possibly under some set of constraints. Since we obviously cannot solve every problem in machine learning, it follows that we cannot generically solve every optimization problem (at least not efficiently). Fortunately, many problems of interest in machine learning can be posed as optimization tasks that have special properties (such as convexity, smoothness, sparsity, and separability) permitting standardized, efficient solution techniques.
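For instance, ridge regression is a smooth, strongly convex problem that plain gradient descent solves reliably. The sketch below is illustrative only (not course material); the data, regularization weight, and iteration count are arbitrary choices:

```python
import numpy as np

# Ridge regression: minimize (1/2)||Ax - b||^2 + (lam/2)||x||^2.
# The objective is smooth and strongly convex, so gradient descent with a
# fixed step size of 1/L converges to the unique minimizer.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
lam = 0.1

x = np.zeros(10)
step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)  # 1/L; L bounds the Hessian's top eigenvalue
for _ in range(2000):
    grad = A.T @ (A @ x - b) + lam * x
    x = x - step * grad

# Sanity check: the closed-form solution (A^T A + lam I)^{-1} A^T b
x_star = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
```

After enough iterations `x` agrees with `x_star` to numerical precision, which is exactly the kind of guarantee convexity buys.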

This course is designed to give a graduate-level student a thorough grounding in these properties and their role in optimization, and a broad comprehension of algorithms tailored to exploit such properties. The focus will be on convex optimization problems (though we may also touch upon nonconvex optimization problems at some points). We will visit and revisit important applications in machine learning and statistics.

Upon completing the course, students should be able to approach an optimization problem (often derived from a machine learning or statistics context) and:

- 1. identify key properties such as convexity, smoothness, and sparsity, and/or possibly reformulate the problem so that it possesses such desirable properties;
- 2. select an algorithm for this optimization problem, with an understanding of the advantages and disadvantages of applying one method over another, given the problem and properties at hand;
- 3. implement this algorithm, or use existing software, to efficiently compute the solution.
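As a concrete instance of this three-step workflow, consider the lasso: the objective is convex but nonsmooth, yet its nonsmooth part is separable with a cheap proximal operator (soft-thresholding), which suggests proximal gradient descent (ISTA) over plain subgradient methods. A minimal sketch on synthetic data; the problem sizes, noise level, and `lam` are illustrative assumptions, not course code:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=5000):
    """Proximal gradient descent (ISTA) for
    minimize (1/2)||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth least-squares part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Gradient step on the smooth part, then prox step on the l1 part
        x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]            # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = lasso_ista(A, b, lam=0.5)
```

The recovered `x_hat` is sparse and close (up to lasso shrinkage) to `x_true` — step 1 identified the structure, step 2 matched it to an algorithm, and step 3 is a dozen lines of code.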


- 1. Lecture 01: Optimization in Machine Learning and Statistics
- 2. Lecture 02: Convexity I: Sets and Functions
- 3. Lecture 03: Convexity II: Optimization Basics
- 4. Lecture 04: Convex Optimization
- 5. Lecture 05: Convex Optimization
- 6. Lecture 06: Convex Optimization
- 7. Lecture 07: Convex Optimization
- 8. Lecture 08: Proximal Gradient Descent and Acceleration
- 9. Lecture 09: Convex Optimization
- 10. Lecture 10: Convex Optimization
- 11. Lecture 11: Convex Optimization
- 12. Lecture 12: Karush-Kuhn-Tucker Conditions
- 13. Lecture 13: Duality Uses and Correspondences
- 14. Lecture 14: Newton's Method
- 15. Lecture 15: Barrier Method
- 16. Lecture 16: Convex Optimization
- 17. Lecture 17: Quasi-Newton Methods
- 18. Lecture 18: Proximal Newton Method
- 19. Lecture 19: Numerical Linear Algebra Primer
- 20. Lecture 20: Coordinate Descent
- 21. Lecture 21: Dual Ascent
- 22. Lecture 22: Alternating Direction Method of Multipliers
- 23. Lecture 23: Frank-Wolfe Method
- 24. Lecture 24: Modern Stochastic Methods
- 25. Lecture 25: Mirror Descent
- 26. Lecture 26: Semester Review

- FreeVideoLectures aims to help millions of students across the world acquire knowledge, earn good grades, get jobs, and gain promotions through quality learning material.
- You can write to us at help@freevideolectures.com

2018 FreeVideoLectures. All rights reserved. FreeVideoLectures only promotes free course material from different sources; we are not endorsed by any university.