Pattern Recognition

IISc Bangalore, Prof. P.S. Sastry

Updated On 02 Feb, 19

Overview

Contents:
Overview of Pattern classification and regression : Introduction to Statistical Pattern Recognition - Overview of Pattern Classifiers
Bayesian decision making and Bayes Classifier : The Bayes Classifier for minimizing Risk - Estimating Bayes Error; Minimax and Neyman-Pearson classifiers
Parametric Estimation of Densities : Implementing Bayes Classifier; Estimation of Class Conditional Densities - Maximum Likelihood estimation of different densities - Bayesian estimation of parameters of density functions, MAP estimates - Bayesian Estimation examples; the exponential family of densities and ML estimates - Sufficient Statistics; Recursive formulation of ML and Bayesian estimates
Mixture Densities and EM Algorithm : Mixture Densities, ML estimation and EM algorithm - Convergence of EM algorithm; overview of Nonparametric density estimation
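
As a companion to the mixture-density module above, here is a minimal sketch of the EM algorithm for a two-component 1-D Gaussian mixture. It is an illustrative outline rather than the course's own implementation; the synthetic data, the fixed component count of two, and the stopping rule are assumptions made only for this example.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100, tol=1e-6, seed=0):
    """EM for a 2-component 1-D Gaussian mixture: returns (weights, means, variances)."""
    rng = np.random.default_rng(seed)
    # Crude initialisation: equal mixing weights, two random data points as means.
    pi = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2, replace=False).astype(float)
    var = np.array([x.var(), x.var()])
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = P(component k | x_i) under current parameters.
        dens = np.stack([
            pi[k] * np.exp(-(x - mu[k]) ** 2 / (2 * var[k])) / np.sqrt(2 * np.pi * var[k])
            for k in range(2)
        ], axis=1)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means and variances from the responsibilities.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        # Stop when the log-likelihood no longer improves noticeably.
        ll = np.log(dens.sum(axis=1)).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return pi, mu, var

# Synthetic data drawn from two Gaussians (an assumption for the demo).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_gmm_1d(x))
```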

Nonparametric density estimation : Nonparametric estimation, Parzen Windows, nearest neighbour methods
Linear models for classification and regression : Linear Discriminant Functions; Perceptron -- Learning Algorithm and convergence proof (a minimal Perceptron sketch appears after this group of topics) - Linear Least Squares Regression; LMS algorithm - ADALINE and LMS algorithm; General nonlinear least-squares regression - Logistic Regression; Statistics of least squares method; Regularized Least Squares - Fisher Linear Discriminant - Linear Discriminant functions for multi-class case; multi-class logistic regression
Overview of statistical learning theory, Empirical Risk Minimization and VC-Dimension : Learning and Generalization; PAC learning framework - Overview of Statistical Learning Theory; Empirical Risk Minimization - Consistency of Empirical Risk Minimization; VC-Dimension - Complexity of Learning problems and VC-Dimension - VC-Dimension Examples; VC-Dimension of hyperplanes
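
As noted in the linear-models entry above, the Perceptron learning rule itself is very short. The sketch below is an illustration under assumed conditions, not the lecture material: two linearly separable classes labelled -1 and +1, a unit learning rate, and a bias term folded into the weight vector.

```python
import numpy as np

def perceptron_train(X, y, max_epochs=100):
    """Perceptron learning rule for labels y in {-1, +1}.

    The bias is handled by augmenting each input with a constant 1.
    Converges in finitely many updates if the data are linearly separable.
    """
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])  # augment inputs with a bias component
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * (w @ xi) <= 0:   # misclassified (or exactly on the boundary)
                w += yi * xi         # move the separating hyperplane toward xi
                errors += 1
        if errors == 0:              # a full pass with no mistakes: training done
            break
    return w

# Toy separable data (assumed for the demo): class +1 near (2, 2), class -1 near (-2, -2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
y = np.array([1] * 20 + [-1] * 20)
print("learned weights (w1, w2, bias):", perceptron_train(X, y))
```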

Artificial Neural Networks for Classification and regression : Overview of Artificial Neural Networks - Multilayer Feedforward Neural Networks with Sigmoidal activation functions - Backpropagation Algorithm; Representational abilities of feedforward networks - Feedforward networks for Classification and Regression; Backpropagation in Practice - Radial Basis Function Networks; Gaussian RBF networks - Learning Weights in RBF networks; K-means clustering algorithm
Support Vector Machines and Kernel based methods : Support Vector Machines -- Introduction, obtaining the optimal hyperplane - SVM formulation with slack variables; nonlinear SVM classifiers - Kernel Functions for nonlinear SVMs; Mercer and positive definite Kernels - Support Vector Regression and ε-insensitive Loss function, examples of SVM learning - Overview of SMO and other algorithms for SVM; ν-SVM and ν-SVR; SVM as a risk minimizer - Positive Definite Kernels; RKHS; Representer Theorem
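
For the SVM module above, one convenient way to experiment with a soft-margin classifier and kernel choices is scikit-learn's SVC. This is a quick illustration with an off-the-shelf solver, not the optimisation derivations covered in the lectures; the XOR-style dataset, the RBF kernel, and the values of C and gamma are arbitrary choices for the demo.

```python
import numpy as np
from sklearn.svm import SVC

# Toy nonlinearly separable data: the label is the sign of x1 * x2 (an XOR-like pattern).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])
y[y == 0] = 1

# Soft-margin SVM with an RBF (Gaussian) kernel; C controls the penalty on slack variables.
clf = SVC(kernel="rbf", C=1.0, gamma=1.0)
clf.fit(X, y)

print("support vectors per class:", clf.n_support_)
print("training accuracy:", clf.score(X, y))
```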

Feature Selection, Model assessment and cross-validation : Feature Selection and Dimensionality Reduction; Principal Component Analysis - No Free Lunch Theorem; Model selection and model estimation; Bias-variance trade-off - Assessing Learnt classifiers; Cross Validation
Boosting and Classifier ensembles : Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost - Risk minimization view of AdaBoost
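
To make the AdaBoost entry above concrete, here is a minimal sketch of discrete AdaBoost with depth-1 trees (decision stumps) from scikit-learn as the weak learners. It follows the standard recipe (weighted error, alpha = 0.5 ln((1 - err) / err), exponential re-weighting); the toy data and the number of boosting rounds are assumptions for the demo, and the risk-minimization view mentioned in the module is not shown here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, n_rounds=25):
    """Discrete AdaBoost with decision stumps; y must take values in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)               # start from uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()           # weighted training error of this stump
        if err == 0:                       # perfect weak learner: keep it and stop
            stumps.append(stump)
            alphas.append(1.0)
            break
        if err >= 0.5:                     # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)     # up-weight mistakes, down-weight correct points
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, np.array(alphas)

def adaboost_predict(stumps, alphas, X):
    """Sign of the alpha-weighted vote of the stumps."""
    votes = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(votes)

# Toy data (assumed): two overlapping Gaussian blobs labelled -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(1.5, 1, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)
stumps, alphas = adaboost_train(X, y)
print("training accuracy:", (adaboost_predict(stumps, alphas, X) == y).mean())
```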

Includes

Lecture 39: Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost

Rating: 4.1 (11 ratings)


Lecture Details

Pattern Recognition by Prof. P.S. Sastry, Department of Electronics & Communication Engineering, IISc Bangalore. For more details on NPTEL visit http://nptel.ac.in

Comments

Sam

Excellent course, it helped me understand topics that I couldn't while attending my college.


Dembe

Great course. Thank you very much.
