Chapter 1 is the introduction to the course. Chapter 2 will be covered at the end of the course.

Chapter 3: Linear Learning Machines

Linear Regression
Classification
Logistic Regression
Multi-class Logistic Regression

Chapter 4: Nearest Neighbor Methods

Nearest-Neighbor Method
K-Nearest Neighbor Method

Chapter 5: Classical Artificial Neural Network

Multi-Layer Perceptron

Chapter 6: Support Vector Machines

Optimal Hyperplane
Generalized Optimal Hyperplane
Kernel Trick
Usage of SVMs
Statistical Learning Theory

Chapter 7: Feature Selection and Extraction for Classification

Separability of Classes
Feature Extraction

Chapter 8: Decision Trees, Random Forest, and Ensemble Learning

Decision Trees
Ensemble Learning

Chapter 9: Bayesian Classifier

Basic Bayesian Statistics
Forms of Bayes' Rule
Bayesian Decision with Gaussian Distributions
Markov Chain
Naive Bayes Classifier
Neyman-Pearson Criterion

Chapter 10: Probability Density Estimation

Density estimation is needed because the class-conditional probability densities are unknown in practice. To apply Bayesian decision theory, we must therefore estimate these densities from the training data. There are two types of estimation: parametric and non-parametric.

Parametric estimation includes Maximum Likelihood Estimation and Bayesian Estimation. Both assume the data follow a known distribution family, such as the normal distribution. If no parametric model fits the data well, use non-parametric estimation instead, which includes the K-nearest-neighbor method.
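As a small illustration of the parametric approach, here is a minimal sketch of maximum likelihood estimation for a one-dimensional Gaussian. The data values are made up for illustration; for a Gaussian, the MLE solution is simply the sample mean and the (biased) sample variance.

```python
import math

def gaussian_mle(samples):
    """Maximum likelihood estimates for a 1-D Gaussian:
    sample mean and (biased, divide-by-n) sample variance."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

def gaussian_pdf(x, mu, var):
    """Density of N(mu, var) evaluated at x, for use in a Bayesian decision rule."""
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical training samples for one class.
data = [1.8, 2.1, 2.4, 1.9, 2.3]
mu, var = gaussian_mle(data)
```

The fitted `gaussian_pdf` can then stand in for the unknown class-conditional density when computing posterior probabilities.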

Chapter 11: HMM & Graphical Models

Chapter 12: Manifold Learning

Principal Component Analysis
t-SNE