Syllabus
Unit 1
Perceptrons – classification – limitations of linear nets and perceptrons – Multi-Layer Perceptrons (MLP); Activation functions – linear, softmax, tanh, ReLU; error functions; Feed-forward networks – Backpropagation – recursive chain rule (backpropagation); Learning weights of a logistic output – Loss functions – learning via gradient descent; Optimization – momentum method; Adaptive learning rates – RMSProp – mini-batch gradient descent; Bias-variance trade-off – Regularization – overfitting – inductive bias – dropout – generalization.
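As an illustration of the "learning weights of a logistic output via gradient descent" topic above, here is a minimal sketch in pure Python. The dataset (the OR function), learning rate, and epoch count are all illustrative choices, not part of the syllabus:

```python
import math

def sigmoid(z):
    """Logistic activation function."""
    return 1.0 / (1.0 + math.exp(-z))

# Tiny linearly separable dataset: the OR function.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 1]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.5         # learning rate

# Stochastic gradient descent on the cross-entropy loss.
for epoch in range(2000):
    for (x1, x2), t in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # For a logistic output with cross-entropy loss, the gradient
        # of the loss w.r.t. the pre-activation is simply (p - t).
        g = p - t
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b    -= lr * g

preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2) in X]
print(preds)  # → [0, 1, 1, 1]
```

The same update loop extends naturally to the momentum and RMSProp variants listed above by accumulating running statistics of `g` before applying it.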
Unit 2
Convolutional Neural Networks – Basics and Evolution of Popular CNN architectures; CNN Applications: Object Detection and Localization, Face Recognition, Neural Style Transfer
Recurrent Neural Networks – GRU – LSTM – Transformer Networks; Applications: NLP and Word Embeddings, Attention Models
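The core operation behind the CNN architectures in this unit is the 2-D convolution (implemented as cross-correlation in most frameworks). A minimal pure-Python sketch with "valid" padding; the sample image and kernel are illustrative:

```python
def conv2d_valid(image, kernel):
    """2-D cross-correlation (the 'convolution' of CNN layers), valid padding.

    image:  H x W list of lists
    kernel: kH x kW list of lists
    returns: (H-kH+1) x (W-kW+1) feature map
    """
    H, W = len(image), len(image[0])
    kH, kW = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kH + 1):
        row = []
        for j in range(W - kW + 1):
            # Sum of element-wise products over the kernel window.
            s = 0.0
            for a in range(kH):
                for b in range(kW):
                    s += image[i + a][j + b] * kernel[a][b]
            row.append(s)
        out.append(row)
    return out

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
k = [[1, 0],
     [0, -1]]
feature_map = conv2d_valid(img, k)
print(feature_map)  # → [[-4.0, -4.0], [-4.0, -4.0]]
```

Real CNN layers add multiple input/output channels, stride, padding, and learned kernels, but the sliding-window structure is exactly this.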
Unit 3
Restricted Boltzmann Machine, Deep Belief Networks, Autoencoders and Applications: Semi-Supervised Classification, Noise Reduction, Non-linear Dimensionality Reduction; Introduction to GAN – Encoder/Decoder, Generator/Discriminator architectures; Challenges in NN training – Data Augmentation – Hyperparameter Settings; Transfer Learning – Developing and Deploying ML Models (e.g., TensorFlow/PyTorch)
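For the data-augmentation topic in this unit, two of the most common image transforms can be sketched in a few lines of pure Python (images as nested lists; the sample data, function names, and fixed random seed are illustrative):

```python
import random

def horizontal_flip(image):
    """Mirror the image left-to-right (a label-preserving augmentation)."""
    return [list(reversed(row)) for row in image]

def random_crop(image, size, rng):
    """Take a random size x size sub-window of the image."""
    H, W = len(image), len(image[0])
    i = rng.randrange(H - size + 1)
    j = rng.randrange(W - size + 1)
    return [row[j:j + size] for row in image[i:i + size]]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]

flipped = horizontal_flip(img)
print(flipped)  # → [[3, 2, 1], [6, 5, 4], [9, 8, 7]]

rng = random.Random(0)  # fixed seed for reproducibility
crop = random_crop(img, 2, rng)
print(crop)  # a 2 x 2 sub-window of img
```

In practice these transforms are applied on the fly during training (e.g., via `torchvision.transforms` in PyTorch or `tf.image` in TensorFlow) so each epoch sees slightly different inputs.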
Evaluation Pattern
Evaluation Pattern: 70:30
| Assessment | Internal | External |
| --- | --- | --- |
| Midterm | 20 | |
| Continuous Assessment – Theory (*CAT) | 10 | |
| Continuous Assessment – Lab (*CAL) | 40 | |
| **End Semester | | 30 (50 Marks; 2 hours exam) |
*CAT – Can be Quizzes, Assignments, and Reports
*CAL – Can be Lab Assessments, Project, and Report
**End Semester can be a theory examination / lab-based examination / project presentation