Pattern Recognition


Contents:
Overview of Pattern classification and regression : Introduction to Statistical Pattern Recognition – Overview of Pattern Classifiers
Bayesian decision making and Bayes Classifier : The Bayes Classifier for minimizing Risk – Estimating Bayes Error; Minimax and Neyman-Pearson classifiers
Parametric Estimation of Densities : Implementing Bayes Classifier; Estimation of Class Conditional Densities – Maximum Likelihood estimation of different densities – Bayesian estimation of parameters of density functions, MAP estimates – Bayesian Estimation examples; the exponential family of densities and ML estimates – Sufficient Statistics; Recursive formulation of ML and Bayesian estimates
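For self-study alongside this module, a minimal NumPy sketch of maximum-likelihood estimation for a univariate Gaussian (the data and parameter values below are illustrative, not taken from the lectures):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=1.5, size=10_000)  # draws from N(2, 1.5^2)

# For a Gaussian, the ML estimates have closed forms:
# mu_ML = sample mean, sigma^2_ML = (biased) sample variance.
mu_ml = samples.mean()
var_ml = ((samples - mu_ml) ** 2).mean()

print(mu_ml, var_ml ** 0.5)  # close to the true 2.0 and 1.5
```

Note that the ML variance estimate divides by n, not n − 1; the bias vanishes as n grows.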
Mixture Densities and EM Algorithm : Mixture Densities, ML estimation and EM algorithm – Convergence of EM algorithm; overview of Nonparametric density estimation
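The EM updates for a mixture can be written in a few lines. Here is a hedged sketch for a two-component 1-D Gaussian mixture (component means, variances, and the initial guesses are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: two well-separated Gaussian components.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])

# Illustrative initial guesses for mixing weights, means, variances.
pi, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def log_likelihood(x, pi, mu, var):
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

lls = []
for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, variances from responsibilities.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    lls.append(log_likelihood(x, pi, mu, var))
```

The log-likelihood sequence `lls` is non-decreasing, which is the convergence property discussed in the lectures.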

Nonparametric density estimation : Nonparametric estimation, Parzen Windows, nearest neighbour methods
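A Parzen-window estimate is just an average of kernels centred on the data. A minimal sketch with a Gaussian kernel (bandwidth and data are illustrative choices, not from the lectures):

```python
import numpy as np

rng = np.random.default_rng(7)
samples = rng.normal(0.0, 1.0, 2000)    # draws from a standard Gaussian

def parzen(xs, data, h):
    """Parzen-window density estimate with a Gaussian kernel of bandwidth h."""
    u = (xs - data[:, None]) / h        # (n_data, n_query) scaled distances
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return k.sum(axis=0) / (len(data) * h)

grid = np.linspace(-3, 3, 61)
est = parzen(grid, samples, h=0.3)      # est[30] approximates the density at 0
```

The bandwidth h controls the bias–variance trade-off: small h gives a spiky estimate, large h oversmooths.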
Linear models for classification and regression : Linear Discriminant Functions; Perceptron — Learning Algorithm and convergence proof – Linear Least Squares Regression; LMS algorithm – AdaLinE and LMS algorithm; General nonlinear least-squares regression – Logistic Regression; Statistics of least squares method; Regularized Least Squares – Fisher Linear Discriminant – Linear Discriminant functions for multi-class case; multi-class logistic regression
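The perceptron learning rule covered here fits in a few lines. A sketch on separable toy data (the weight vector, margin filter, and epoch budget are illustrative assumptions chosen so the convergence theorem applies):

```python
import numpy as np

rng = np.random.default_rng(2)
# Separable toy data: keep only points with functional margin > 1 from a
# chosen true boundary, so the perceptron convergence theorem guarantees
# a finite number of mistakes.
w_true = np.array([1.0, -2.0, 0.5])           # last entry acts as the bias
X = rng.normal(size=(500, 2))
Xa = np.hstack([X, np.ones((500, 1))])        # bias folded in as a constant feature
keep = np.abs(Xa @ w_true) > 1.0
Xa = Xa[keep]
y = np.sign(Xa @ w_true)                      # labels in {-1, +1}

w = np.zeros(3)
for _ in range(500):                          # epochs
    errors = 0
    for xi, yi in zip(Xa, y):
        if yi * (w @ xi) <= 0:                # misclassified: nudge w toward yi*xi
            w += yi * xi
            errors += 1
    if errors == 0:                           # all points correct: converged
        break
```

The update `w += yi * xi` is the entire algorithm; the convergence proof bounds the number of such updates by (R/γ)² for data of radius R and margin γ.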
Overview of statistical learning theory, Empirical Risk Minimization and VC-Dimension : Learning and Generalization; PAC learning framework – Overview of Statistical Learning Theory; Empirical Risk Minimization – Consistency of Empirical Risk Minimization – Consistency of Empirical Risk Minimization; VC-Dimension – Complexity of Learning problems and VC-Dimension – VC-Dimension Examples; VC-Dimension of hyperplanes

Artificial Neural Networks for Classification and regression : Overview of Artificial Neural Networks – Multilayer Feedforward Neural networks with Sigmoidal activation functions – Backpropagation Algorithm; Representational abilities of feedforward networks – Feedforward networks for Classification and Regression; Backpropagation in Practice – Radial Basis Function Networks; Gaussian RBF networks – Learning Weights in RBF networks; K-means clustering algorithm
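The K-means algorithm used in this module for placing RBF centres alternates two steps. A minimal sketch (blob positions and the fixed initial centres are illustrative choices for a deterministic demo):

```python
import numpy as np

rng = np.random.default_rng(3)
# Two well-separated spherical blobs around (0,0) and (4,4).
X = np.vstack([rng.normal(0.0, 0.5, (100, 2)), rng.normal(4.0, 0.5, (100, 2))])

k = 2
centres = np.array([[-1.0, -1.0], [5.0, 5.0]])   # fixed init for reproducibility
for _ in range(20):
    # Assignment step: each point joins its nearest centre.
    d = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)
    # Update step: each centre moves to the mean of its assigned points.
    new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new, centres):                # no movement: converged
        break
    centres = new
```

In the RBF context, the converged `centres` would become the Gaussian basis-function centres; K-means is sensitive to initialization in general, which the fixed init above sidesteps.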
Support Vector Machines and Kernel based methods : Support Vector Machines — Introduction, obtaining the optimal hyperplane – SVM formulation with slack variables; nonlinear SVM classifiers – Kernel Functions for nonlinear SVMs; Mercer and positive definite Kernels – Support Vector Regression and ε-insensitive Loss function, examples of SVM learning – Overview of SMO and other algorithms for SVM; ν-SVM and ν-SVR; SVM as a risk minimizer – Positive Definite Kernels; RKHS; Representer Theorem
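The slack-variable SVM formulation in this module minimizes a regularized hinge loss. As one hedged sketch of that objective (not the SMO algorithm from the lectures), here is Pegasos-style subgradient descent on a linear soft-margin SVM; the data, λ, and iteration count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
# Two nearly separable classes, labels in {-1, +1}.
X = np.vstack([rng.normal(-1.5, 1.0, (100, 2)), rng.normal(1.5, 1.0, (100, 2))])
y = np.array([-1.0] * 100 + [1.0] * 100)

# Objective: lam/2 * ||w||^2 + mean over points of hinge loss max(0, 1 - y f(x)).
lam, w, b = 0.01, np.zeros(2), 0.0
for t in range(1, 2001):
    eta = 1.0 / (lam * t)                        # decreasing step size
    viol = y * (X @ w + b) < 1                   # margin violators
    # Subgradient of the objective at (w, b).
    w -= eta * (lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X))
    b -= eta * (-(y[viol].sum()) / len(X))

acc = (np.sign(X @ w + b) == y).mean()           # training accuracy
```

Only the margin violators contribute to the update, mirroring the fact that the SVM solution depends only on support vectors.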

Feature Selection, Model assessment and cross-validation : Feature Selection and Dimensionality Reduction; Principal Component Analysis – No Free Lunch Theorem; Model selection and model estimation; Bias-variance trade-off – Assessing Learnt classifiers; Cross Validation
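Principal Component Analysis reduces dimensionality by projecting onto the directions of largest variance. A minimal SVD-based sketch (the synthetic data, which varies mostly along one direction, is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(5)
# 2-D data that mostly varies along the direction (3, 1), plus small noise.
z = rng.normal(size=(500, 1))
X = z @ np.array([[3.0, 1.0]]) + rng.normal(scale=0.3, size=(500, 2))

# PCA: centre the data, then take the right singular vectors as principal axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_var = s ** 2 / len(X)      # variance captured along each axis
scores = Xc @ Vt.T                   # data projected onto the principal axes
```

Keeping only the first column of `scores` here would retain most of the variance with half the dimensions, which is the dimensionality-reduction idea in this module.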
Boosting and Classifier ensembles : Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost – Risk minimization view of AdaBoost
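AdaBoost with decision stumps can be written compactly. A sketch on a 1-D toy problem (the interval target, stump grid, and round count are illustrative choices, not from the lectures):

```python
import numpy as np

rng = np.random.default_rng(6)
# Toy target: label +1 inside the interval (-0.5, 0.5), -1 outside.
x = rng.uniform(-1, 1, 400)
y = np.where(np.abs(x) < 0.5, 1.0, -1.0)

thetas = np.linspace(-1, 1, 81)       # candidate stump thresholds
w = np.full(len(x), 1.0 / len(x))     # AdaBoost sample weights
ensemble = []                         # (alpha, theta, s) triples

for _ in range(20):
    # Weak learner: stump h(x) = s * sign(x - theta) with lowest weighted error.
    best = None
    for theta in thetas:
        for s in (-1.0, 1.0):
            pred = s * np.sign(x - theta)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, theta, s)
    err, theta, s = best
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    ensemble.append((alpha, theta, s))
    # Reweight: misclassified points gain weight, correct ones lose it.
    pred = s * np.sign(x - theta)
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()

def predict(xs):
    return np.sign(sum(a * s * np.sign(xs - t) for a, t, s in ensemble))

acc = (predict(x) == y).mean()        # training accuracy of the ensemble
```

The exponential reweighting step is exactly where the risk-minimization view enters: AdaBoost performs stagewise minimization of the exponential loss.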

Course Curriculum

Introduction to Statistical Pattern Recognition Details 0:55
Overview of Pattern Classifiers Details 55:39
The Bayes Classifier for minimizing Risk Details 56:41
Estimating Bayes Error; Minimax and Neyman-Pearson classifiers Details 57:16
Implementing Bayes Classifier; Estimation of Class Conditional Densities Details 58:08
Maximum Likelihood estimation of different densities Details 58:16
Bayesian estimation of parameters of density functions, MAP estimates Details 57:06
Sufficient Statistics; Recursive formulation of ML and Bayesian estimates Details 58:07
Mixture Densities, ML estimation and EM algorithm Details 57:27
Convergence of EM algorithm; overview of Nonparametric density estimation Details 58:18
Nonparametric estimation, Parzen Windows, nearest neighbour methods Details 57:30
Linear Discriminant Functions; Perceptron — Learning Algorithm and convergence proof Details 58:22
Linear Least Squares Regression; LMS algorithm Details 58:16
AdaLinE and LMS algorithm; General nonlinear least-squares regression Details 58:18
Logistic Regression; Statistics of least squares method; Regularized Least Squares Details 58:23
Fisher Linear Discriminant Details 58:12
Linear Discriminant functions for multi-class case; multi-class logistic regression Details 57:24
Learning and Generalization; PAC learning framework Details 59:02
Overview of Statistical Learning Theory; Empirical Risk Minimization Details 58:53
Consistency of Empirical Risk Minimization Details 58:35
Consistency of Empirical Risk Minimization; VC-Dimension Details 58:14
Complexity of Learning problems and VC-Dimension Details 58:38
VC-Dimension Examples; VC-Dimension of hyperplanes Details 0:59
Overview of Artificial Neural Networks Details 59:11
Multilayer Feedforward Neural networks with Sigmoidal activation functions Details 58:57
Backpropagation Algorithm; Representational abilities of feedforward networks Details 59:01
Feedforward networks for Classification and Regression; Backpropagation in Practice Details 58:40
Radial Basis Function Networks; Gaussian RBF networks Details 58:4
Learning Weights in RBF networks; K-means clustering algorithm Details 59:02
Support Vector Machines — Introduction, obtaining the optimal hyperplane Details 58:54
SVM formulation with slack variables; nonlinear SVM classifiers Details 0:59
Kernel Functions for nonlinear SVMs; Mercer and positive definite Kernels Details 58:45
Support Vector Regression and ε-insensitive Loss function, examples of SVM learning Details 58:40
Overview of SMO and other algorithms for SVM; ν-SVM and ν-SVR; SVM as a risk minimizer Details 58:29
Positive Definite Kernels; RKHS; Representer Theorem Details 58:46
Feature Selection and Dimensionality Reduction; Principal Component Analysis Details 59:14
No Free Lunch Theorem; Model selection and model estimation; Bias-variance trade-off Details 59:53
Assessing Learnt classifiers; Cross Validation Details 59:50
Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost Details 59:31


FreeVideoLectures.com All rights reserved.
