
Course Detail

Course Name: Deep Learning Lab
Course Code: 26CSA686
Credits: 1
Campuses: Amritapuri, Mysuru

Syllabus

  1. CNNs for handwritten digit recognition using TensorFlow.  
  2. CNNs for handwritten digit recognition using Keras.  
  3. Simple image classification with the Inception model.  
  4. Demonstrate the use of GoogLeNet and hyper-parameter optimization.  
  5. Demonstrate the use of AlexNet and hyper-parameter optimization.  
  6. Create a CONV layer of a CNN.  
  7. Display details of a CONV layer of a CNN.  
  8. Demonstrate the use of stride and padding for CONV layers of a CNN.  
  9. Neuron view of the convolution layer.  
  10. ReLU in CNNs.  
  11. Pooling and fully connected layers in CNNs.  
  12. Classify movie reviews: binary classification using Keras.  
  13. Python code: RNNs for handwritten digit recognition using TensorFlow.  
  14. Python code: bidirectional RNNs for handwritten digit recognition using TensorFlow.  
  15. Python code: next-word prediction using RNNs.
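Items 1–2 and 6–11 above can be sketched together in one small script: a convolutional network for handwritten digit recognition built with the Keras API that ships inside TensorFlow. The layer sizes, stride, and padding choices here are illustrative assumptions, not the official lab solution, and random data shaped like MNIST is used so the sketch runs offline (the lab itself would load the real digits via `tf.keras.datasets.mnist.load_data()`).

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Synthetic stand-in for MNIST so the sketch runs without a download;
# the real lab data comes from tf.keras.datasets.mnist.load_data().
rng = np.random.default_rng(0)
x_train = rng.random((256, 28, 28, 1), dtype=np.float32)
y_train = rng.integers(0, 10, size=256)

model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    # CONV layer with an explicit stride and padding (syllabus items 6 and 8),
    # using the ReLU activation discussed in item 10.
    layers.Conv2D(32, kernel_size=3, strides=1, padding="same", activation="relu"),
    layers.MaxPooling2D(pool_size=2),        # pooling layer (item 11)
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),    # fully connected layer (item 11)
    layers.Dense(10, activation="softmax"),  # one output per digit class
])

# Display details of every layer, including the CONV layers (item 7):
# output shapes and parameter counts.
model.summary()

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=64, verbose=0)

preds = model.predict(x_train[:5], verbose=0)
print(preds.shape)  # (5, 10): a probability distribution over the 10 digits per image
```

Treating each 28×28 image as a sequence of 28 rows and swapping the convolutional stack for `layers.SimpleRNN(...)` or `layers.Bidirectional(layers.LSTM(...))` gives one possible starting point for items 13–14.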

Objectives and Outcomes

Course Objectives 

Introduce the major deep learning algorithms, their problem settings, and their applications to solving real-world problems.  

Course Outcomes 

CO1: Demonstrate an understanding of deep learning and the main research directions in the field.  

CO2: Design and implement deep neural network systems.  

CO3: Use neural networks in various application domains.  

CO4: Use deep learning technologies throughout most of the machine learning pipeline.  

CO5: Develop algorithms for solving real-world problems.  

CO-PO Mapping 

(CO1–CO5 are mapped against PO1–PO8; the individual mapping values in the table were not recovered.)
Textbooks / References

  • Domingos, Pedro. “A few useful things to know about machine learning.” Communications of the ACM 55.10 (2012): 78-87.  
  • Li Fei-Fei (Stanford), Rob Fergus (NYU), Antonio Torralba (MIT), “Recognizing and Learning Object Categories” (Awarded the Best Short Course Prize at ICCV 2005).  
  • Baydin, Atilim Gunes, Barak A. Pearlmutter, and Alexey Andreyevich Radul. “Automatic differentiation in machine learning: a survey.” arXiv preprint arXiv:1502.05767 (2015).  
  • Bengio, Yoshua. “Practical recommendations for gradient-based training of deep architectures.” Neural Networks: Tricks of the Trade. Springer Berlin Heidelberg, 2012. 437-478.  
  • LeCun, Yann A., et al. “Efficient backprop.” Neural Networks: Tricks of the Trade. Springer Berlin Heidelberg, 2012. 9-48.  

