Module 1:
Fundamentals of Neural Networks – Types of Machine Learning, Types of Artificial Neural Networks – The McCulloch–Pitts Network, Perceptron, The Sigmoid Neuron, Linearly Separable Problems, Multilayer Perceptron (MLP), Optimization Techniques, Exploding Gradient Problem, Weight Initialization, Deep Learning. Convolutional Neural Network (CNN) – Components of CNN Architecture, Rectified Linear Unit (ReLU) Layer, Exponential Linear Unit (ELU) and Scaled Exponential Linear Unit (SELU), Unique Properties of CNN, Architectures of CNN, Applications of CNN.
Module 2:
Recurrent Neural Network (RNN) – Simple Recurrent Neural Network, Long Short-Term Memory (LSTM) Implementation, Gated Recurrent Unit (GRU), Deep Recurrent Neural Network. Autoencoder – Features of Autoencoder, Types of Autoencoder.
Module 3:
Restricted Boltzmann Machine (RBM) – Boltzmann Machine, RBM Architecture, Types of RBM.
Open-Source Frameworks for Deep Learning – Python, TensorFlow, Keras, PyTorch.
Applications of Deep Learning – Image Classification using CNN, Visual Speech Recognition using 3D-CNN, Stock Market Prediction using RNN, Next Word Prediction using RNN-LSTM.
Suggested Lab Sessions:
· Implement and analyse perceptron and multilayer perceptron (MLP) models for linear and non-linear classification tasks, using Python with TensorFlow or Keras (see the first sketch after this list).
· Build a CNN model to classify handwritten digits using the MNIST dataset. Include convolution, pooling, and ReLU/ELU layers, and evaluate accuracy and loss curves (sketch below).
· Build a simple RNN to predict the next value in a sine-wave sequence, and implement an LSTM model for next-word prediction using a sample text corpus (sketch below).
· Develop an autoencoder to perform dimensionality reduction and reconstruction (sketch below).
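A minimal sketch for the perceptron/MLP session, assuming TensorFlow/Keras is installed; the XOR toy data, layer sizes, and epoch count are illustrative choices, not prescribed by the syllabus:

```python
# Sketch: a single-unit "perceptron" vs. a small MLP on XOR, a classic
# non-linearly-separable task. Data and hyperparameters are illustrative.
import numpy as np
import tensorflow as tf

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y = np.array([0, 1, 1, 0], dtype=np.float32)  # XOR labels

perceptron = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # no hidden layer
])
mlp = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),      # hidden layer supplies the non-linearity
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

for name, model in [("perceptron", perceptron), ("mlp", mlp)]:
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=1000, verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    print(f"{name} accuracy on XOR: {acc:.2f}")  # the MLP typically reaches 1.00
```

The perceptron cannot separate XOR with a single linear boundary, which is the point of the comparison.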
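A minimal sketch for the MNIST CNN session, assuming TensorFlow/Keras and access to tf.keras.datasets.mnist; filter counts and the epoch count are illustrative:

```python
# Sketch: small CNN on MNIST with convolution, pooling, and ReLU layers.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0   # add channel dim, scale to [0, 1]
x_test = x_test[..., None].astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),     # convolution + ReLU
    tf.keras.layers.MaxPooling2D(),                        # pooling
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# history.history holds per-epoch accuracy and loss for plotting the curves.
history = model.fit(x_train, y_train, epochs=5,
                    validation_data=(x_test, y_test))
print(model.evaluate(x_test, y_test, verbose=0))
```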
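A minimal sketch for the RNN/LSTM session, covering both the sine-wave next-value task and a tiny next-word model; the window length, sample corpus, and layer sizes are illustrative assumptions:

```python
# Sketch: SimpleRNN for sine-wave forecasting, then an LSTM next-word model.
import numpy as np
import tensorflow as tf

# --- Part 1: predict the next value of a sine wave from a sliding window ---
t = np.linspace(0, 50, 2000, dtype=np.float32)
wave = np.sin(t)
window = 20
X = np.stack([wave[i:i + window] for i in range(len(wave) - window)])[..., None]
y = wave[window:]

rnn = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.SimpleRNN(32),     # swap for LSTM/GRU to compare
    tf.keras.layers.Dense(1),
])
rnn.compile(optimizer="adam", loss="mse")
rnn.fit(X, y, epochs=10, verbose=0)
print(rnn.predict(X[-1:], verbose=0))  # value following the last window

# --- Part 2: next-word prediction with an LSTM on a tiny sample corpus ---
text = "deep learning models learn hierarchical representations of data"
words = text.split()
vocab = sorted(set(words))
word_to_id = {w: i for i, w in enumerate(vocab)}
ids = np.array([word_to_id[w] for w in words])

X_words = np.stack([ids[i:i + 2] for i in range(len(ids) - 2)])  # 2-word context
y_words = ids[2:]

lm = tf.keras.Sequential([
    tf.keras.Input(shape=(2,), dtype="int32"),
    tf.keras.layers.Embedding(len(vocab), 16),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(len(vocab), activation="softmax"),
])
lm.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
lm.fit(X_words, y_words, epochs=200, verbose=0)
pred = lm.predict(X_words[:1], verbose=0).argmax(axis=-1)[0]
print("predicted next word:", vocab[pred])
```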
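A minimal sketch for the autoencoder session, assuming TensorFlow/Keras and MNIST as the example data; the 32-dimensional latent code and layer widths are illustrative choices:

```python
# Sketch: dense autoencoder compressing MNIST digits to a small latent code
# and reconstructing them from that code.
import tensorflow as tf

(x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

latent_dim = 32
encoder = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(latent_dim, activation="relu"),   # compressed code
])
decoder = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),        # reconstruction
])
autoencoder = tf.keras.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_train, x_train, epochs=5, validation_data=(x_test, x_test))

codes = encoder.predict(x_test, verbose=0)            # reduced representation
reconstructions = decoder.predict(codes, verbose=0)   # reconstructed digits
```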