Course Code: 315MCE
Hours / Week: L 3   T 0   P 0
Credits: C 3
Maximum Marks: CA 50   EA 50   Total 100
PREREQUISITES:
OBJECTIVES:
UNIT I  INTRODUCTION  9
Learning - Design of a learning system - Types of machine learning - Applications - Probability Theory - Probability densities - Expectations and covariance - Bayesian probabilities - Gaussian distribution - Curve fitting re-visited - Bayesian curve fitting - Probability distributions - Decision Theory - Bayes Decision Theory - Information Theory
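As a small illustration of the Bayesian probabilities topic above, the following is a minimal sketch of a discrete Bayes' rule update; the two-hypothesis coin example and all numbers are illustrative assumptions, not part of the syllabus.

```python
# Minimal sketch of Bayes' rule over a discrete hypothesis space.
# The hypotheses and probabilities below are illustrative only.

def bayes_update(prior, likelihood):
    """Return the posterior P(h | data) given a prior P(h) and
    the likelihood P(data | h) for each hypothesis h."""
    unnormalized = [p * l for p, l in zip(prior, likelihood)]
    evidence = sum(unnormalized)          # P(data), the normalizer
    return [u / evidence for u in unnormalized]

# Two hypotheses: "fair coin" vs "biased coin with P(heads) = 0.9".
prior = [0.5, 0.5]
likelihood = [0.5, 0.9]                   # P(observed heads | h)
posterior = bayes_update(prior, likelihood)
```

After observing one head, the posterior shifts toward the biased-coin hypothesis while still summing to one.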
UNIT II  SUPERVISED LEARNING  9
Linear Models for Regression - Linear Basis Function Models - Predictive distribution - Equivalent kernel - Bayesian Model Comparison - Evidence Approximation - Effective number of parameters - Limitations of Fixed Basis Functions - Linear Models for Classification - Naïve Bayes - Discriminant Functions - Probabilistic Generative Models - Probabilistic Discriminative Models - Bayesian Logistic Regression - Neural Networks - Feed-forward Network Functions - Back-propagation
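The linear basis function models listed in this unit reduce to a least-squares fit once a basis is chosen; the sketch below uses a polynomial basis and noise-free data, both illustrative assumptions.

```python
import numpy as np

# Least-squares fit of a linear model with polynomial basis functions
# phi_j(x) = x**j (an illustrative choice of basis).

def design_matrix(x, degree):
    """Stack the basis functions phi_j(x) = x**j as columns."""
    return np.vstack([x**j for j in range(degree + 1)]).T

def fit_least_squares(x, t, degree):
    """Maximum-likelihood weights, i.e. the least-squares solution
    of Phi w = t for the design matrix Phi."""
    Phi = design_matrix(x, degree)
    w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
    return w

# Noise-free targets from t = 1 + 2x are recovered exactly.
x = np.linspace(0.0, 1.0, 20)
t = 1.0 + 2.0 * x
w = fit_least_squares(x, t, degree=1)
```

With a higher `degree` and noisy targets, the same code exhibits the over-fitting behaviour that motivates the Bayesian treatment covered in the unit.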
UNIT III  UNSUPERVISED LEARNING  9
Factor analysis - Principal Component Analysis - Probabilistic PCA - Independent Components Analysis
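Principal Component Analysis, the central topic of this unit, can be sketched via an eigendecomposition of the sample covariance matrix; the synthetic 2-D dataset below is an illustrative assumption.

```python
import numpy as np

# PCA via eigendecomposition of the sample covariance matrix
# (the classical construction; the data below is illustrative).

def pca(X, n_components):
    """Return the top principal directions and the projected data."""
    Xc = X - X.mean(axis=0)               # center the data
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]     # sort by decreasing variance
    components = eigvecs[:, order[:n_components]]
    return components, Xc @ components

rng = np.random.default_rng(0)
# 2-D data stretched along the first axis, so the leading principal
# component lies (almost) along that axis.
X = rng.normal(size=(200, 2)) * np.array([5.0, 0.5])
components, projected = pca(X, n_components=1)
```

Probabilistic PCA, also listed above, recovers the same subspace as the maximum-likelihood solution of a latent-variable model.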
UNIT IV  GRAPHICAL MODELS  9
Graphical Models - Undirected Graphical Models - Markov Random Fields - Directed Graphical Models - Bayesian Networks - Conditional independence properties - Inference - Learning - Generalization - Hidden Markov Models - Conditional Random Fields (CRFs)
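For the Hidden Markov Models topic in this unit, the forward algorithm computes a sequence likelihood by summing over hidden state paths; the two-state model and its probabilities below are illustrative assumptions.

```python
import numpy as np

# Forward algorithm for an HMM: P(observation sequence) computed by
# summing over all hidden state paths. The model is illustrative.

def forward(pi, A, B, obs):
    """pi[i]: initial state probabilities, A[i, j]: P(state j | state i),
    B[i, k]: P(symbol k | state i), obs: list of observed symbol indices."""
    alpha = pi * B[:, obs[0]]             # initialize with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # propagate, then weight by emission
    return alpha.sum()                    # total likelihood of the sequence

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],                 # state 0 mostly emits symbol 0
              [0.2, 0.8]])                # state 1 mostly emits symbol 1
p = forward(pi, A, B, obs=[0, 0, 1])
```

Summing `forward` over every possible sequence of a fixed length gives exactly 1, a quick sanity check on the recursion.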
UNIT V  REINFORCEMENT LEARNING  9
Reinforcement Learning - Introduction - Single State Case: K-Armed Bandit - Elements of Reinforcement Learning - Model-Based Learning - Value Iteration - Policy Iteration - Temporal Difference Learning - Exploration Strategies - Deterministic Rewards and Actions - Nondeterministic Rewards and Actions - Eligibility Traces - Generalization - Partially Observable States - The Setting - The Tiger Problem
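The Value Iteration topic above can be sketched on a tiny deterministic MDP; the two-state environment, rewards, and discount factor below are illustrative assumptions, not from the syllabus.

```python
# Value iteration on a tiny two-state deterministic MDP
# (states, rewards, and discount factor are illustrative).

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """P[s][a]: next state reached from s under action a (deterministic),
    R[s][a]: immediate reward. Returns the optimal state values V[s]."""
    n = len(P)
    V = [0.0] * n
    while True:
        # Bellman optimality backup: V(s) = max_a [ R(s,a) + gamma * V(s') ]
        V_new = [max(R[s][a] + gamma * V[P[s][a]] for a in range(len(P[s])))
                 for s in range(n)]
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# State 0: action 0 stays (reward 0), action 1 moves to state 1 (reward 1).
# State 1: action 0 stays with reward 2, action 1 stays with reward 0.
P = [[0, 1], [1, 1]]
R = [[0.0, 1.0], [2.0, 0.0]]
V = value_iteration(P, R)
```

The fixed point is easy to verify by hand: V(1) = 2 / (1 - 0.9) = 20, and V(0) = 1 + 0.9 · 20 = 19.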
TOTAL: 45
REFERENCES
1. Ethem Alpaydin, "Introduction to Machine Learning", Prentice Hall of India, 2014.
2. Christopher Bishop, "Pattern Recognition and Machine Learning", Springer, 2006.
3. Kevin P. Murphy, "Machine Learning: A Probabilistic Perspective", MIT Press, 2012.
4. Tom Mitchell, "Machine Learning", McGraw-Hill, 1997.
5. Trevor Hastie, Robert Tibshirani, Jerome Friedman, "The Elements of Statistical Learning", 2nd ed., Springer, 2008.
6. Stephen Marsland, "Machine Learning: An Algorithmic Perspective", CRC Press, 2009.
COURSE OUTCOMES:
1. To implement a neural network for an application of your choice using an available tool.
2. To implement probabilistic discriminative and generative algorithms for an application of your choice and analyze the results.
3. To use a tool to implement typical clustering algorithms for different types of applications.
4. To design and implement an HMM for a sequence-model type of application.