
Introduction to Machine Learning

Carnegie Mellon University, Prof. Pat Virtue


Overview

Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., programs that learn to recognize human faces, recommend music and movies, and drive autonomous robots). This course covers the core concepts, theory, algorithms, and applications of machine learning. We cover supervised learning topics such as classification (Naive Bayes, logistic regression, support vector machines, neural networks, k-NN, decision trees, boosting) and regression (linear, nonlinear, kernel, nonparametric), as well as unsupervised learning (density estimation, clustering, PCA, dimensionality reduction).
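
To make the classification topic list above concrete, here is a minimal sketch of one of the listed methods, k-nearest neighbors, in Python/NumPy. It is an illustrative example only, not code from the course; the toy data, the helper name knn_predict, and the choice k=3 are assumptions for the example.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the query point x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote among their labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy usage: two tiny clusters labeled 0 and 1 (hypothetical data)
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.95, 1.0])))  # expected: 1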

Includes

Lecture 25: Deep RL + K-Means - Introduction to Machine Learning



Lecture Details

Full Playlist: https://www.youtube.com/playlist?list=PLpqQKYIU-snAPM89YPPwyQ9xdaiAdoouk
Course Link: http://www.cs.cmu.edu/~mgormley/courses/10601-s20/
Schedule: http://www.cs.cmu.edu/~mgormley/courses/10601-s20/schedule.html
White Board/Lecture Notes: https://onedrive.live.com/view.aspx?resid=2A78C342EA463DA9!881&authkey=!ABXJKwZXCIDAwjo
Slides: http://www.cs.cmu.edu/~mgormley/courses/10601-s20/slides/
Previous Versions of this course: http://www.cs.cmu.edu/~mgormley/courses/10601/previous.html

Course Overview:
Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., programs that learn to recognize human faces, recommend music and movies, and drive autonomous robots). This course covers the theory and practical algorithms for machine learning from a variety of perspectives. We cover topics such as Bayesian networks, decision tree learning, Support Vector Machines, statistical learning methods, unsupervised learning, and reinforcement learning. The course covers theoretical concepts such as inductive bias, the PAC learning framework, Bayesian learning methods, margin-based learning, and Occam’s Razor. Programming assignments include hands-on experiments with various learning algorithms. This course is designed to give a graduate-level student a thorough grounding in the methodologies, technologies, mathematics, and algorithms currently needed by people who do research in machine learning.
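
Since Lecture 25 pairs deep RL with K-Means, a short K-Means (Lloyd's algorithm) sketch may help ground the unsupervised-learning portion of the overview. This is an illustrative NumPy sketch, not the course's reference implementation; the function name kmeans and the parameters k, n_iters, and seed are assumptions for the example.

import numpy as np

def kmeans(X, k=2, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize cluster centers as k distinct, randomly chosen data points
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: label each point with its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its assigned points
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels

# Toy usage: two Gaussian blobs centered near (0, 0) and (5, 5)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)),
               rng.normal(5.0, 0.5, size=(20, 2))])
centers, labels = kmeans(X, k=2)
print(centers)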

10-301 and 10-601 are identical. Undergraduates must register for 10-301 and graduate students must register for 10-601.

Learning Outcomes: By the end of the course, students should be able to:

- Implement and analyze existing learning algorithms, including well-studied methods for classification, regression, structured prediction, clustering, and representation learning
- Integrate multiple facets of practical machine learning in a single system: data preprocessing, learning, regularization and model selection
- Describe the formal properties of models and algorithms for learning and explain the practical implications of those results
- Compare and contrast different paradigms for learning (supervised, unsupervised, etc.)
- Design experiments to evaluate and compare different machine learning techniques on real-world problems
- Employ probability, statistics, calculus, linear algebra, and optimization in order to develop new predictive models or learning methods
- Given a description of an ML technique, analyze it to identify:
(1) the expressive power of the formalism;
(2) the inductive bias implicit in the algorithm;
(3) the size and complexity of the search space;
(4) the computational properties of the algorithm; and
(5) any guarantees (or lack thereof) regarding termination, convergence, correctness, accuracy, or generalization power.

Comments

Sam

Excellent course. It helped me understand topics that I couldn't while attending my college.


Dembe

Great course. Thank you very much.
