Neural Networks & Deep Learning

Introduction to Neural Networks and Deep Learning; Mathematics for Deep Learning: Linear Algebra Part 1; Mathematics for Deep Learning: Functions and Convex Optimization; Introduction to Loss Functions; Neural Networks Deconstructed for Supervised Learning (Classification); Building Blocks of Neural Networks; TensorFlow, Keras, and TensorBoard; Babysitting the Neural Network

Introduction to Neural Networks and Deep Learning

1.       Introduction to Neural Networks

2.       NN History

3.       Why data-driven?

4.       Hands-on Python demo: K-NN classifier on the MNIST dataset

5.       The choice of distance is a Hyperparameter

6.       Why K-NN is never used on images

7.       Parametric Approach

8.       Linear Algebra: Scalars, vectors, matrices, tensors

9.       Linear Algebra: Linear operations on vectors and matrices
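The K-NN demo listed above (items 4–6) boils down to a few lines of NumPy. The sketch below is not the course notebook, just a minimal illustration: a toy 2-D dataset stands in for MNIST image vectors, and `knn_predict` is a name chosen here for illustration.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    # Pairwise L2 distances between each test and each train point.
    # The choice of distance (L1, L2, ...) is a hyperparameter.
    dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    # Indices of the k nearest neighbours for each test point.
    nearest = np.argsort(dists, axis=1)[:, :k]
    # Majority vote among the k neighbours' labels.
    return np.array([np.bincount(y_train[row]).argmax() for row in nearest])

# Two well-separated clusters stand in for image feature vectors.
X_train = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0., 0.5], [5., 5.5]])
print(knn_predict(X_train, y_train, X_test, k=3))  # → [0 1]
```

Note that prediction requires comparing against every training point, one reason K-NN is never used on raw images in practice.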

 

Mathematics for Deep Learning: Linear Algebra Part 1

1.       Scalars, Vectors, Matrices, And Tensors

2.       Linear operations on Vectors and Matrices

3.       Vector Properties: Vector norms, some special vectors and matrices

4.       Hands-on Python demo: Vector and matrix properties
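The vector and matrix properties covered in this module (norms, special matrices, linearity of matrix operations) can be tried directly in NumPy. This is a small illustrative snippet, not the LinearAlgebra.ipynb notebook:

```python
import numpy as np

v = np.array([3.0, 4.0])
# L1, L2, and infinity norms of the same vector.
print(np.linalg.norm(v, 1))       # 7.0
print(np.linalg.norm(v, 2))       # 5.0
print(np.linalg.norm(v, np.inf))  # 4.0

# Some special matrices: identity, diagonal, and a symmetric matrix.
I = np.eye(2)
D = np.diag([2.0, 3.0])
A = np.array([[1.0, 2.0], [2.0, 1.0]])
print(np.allclose(A, A.T))  # True — A is symmetric

# Matrix-vector product is a linear operation: A(2v) == 2(Av).
print(np.allclose(A @ (2 * v), 2 * (A @ v)))  # True
```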

Hands On Lab

1.       LinearAlgebra.ipynb

2.       Linear Algebra.pptx

3.       Matrices

4.       Line Concept

5.       DLCP Project 1 Brief.pdf

6.       KNN_SVHN.ipynb

7.       Link to the dataset

Mathematics for Deep Learning: Functions and Convex Optimization

1.       GPU Cloud Labs: Google Colab

         1.       Working with Google Colab Part 1: Getting Started

         2.       Working with Google Colab Part 2: Mounting Data

2.       Functions and derivatives part 1

3.       Functions and derivatives part 2

4.       Optimizing a continuous function part 1

5.       Optimizing a continuous function part 2

6.       Hands-on Python demo – Gradient descent

7.       GradientDescent2.ipynb
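Gradient descent, the core of this module, fits in a few lines. The sketch below is a toy stand-in for the GradientDescent2.ipynb demo, not the notebook itself, minimizing the convex function f(x) = (x - 3)^2 whose minimum is at x = 3:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a differentiable function given its gradient function."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step against the gradient
    return x

# f(x) = (x - 3)^2, so f'(x) = 2(x - 3); the unique minimum is x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3.0), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

For a convex function like this one, gradient descent with a small enough learning rate converges to the global minimum; the learning rate itself is a hyperparameter to tune.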

Introduction to Loss Functions

1.       Components of Supervised Machine Learning

2.       Components of Supervised ML: Model, Parameters, and Hyperparameters

3.       Components of Supervised ML: Loss functions

4.       Examples of Loss functions: MSE vs MAE Loss for Regression

5.       Hands-on Python demo: MSE loss function

6.       SimpleLossAndGradientDescent.ipynb

7.       Examples of loss functions for classification: MSE vs Cross Entropy loss

8.       Hands-on Python demo: MSE vs CE Loss functions

9.       SimpleLossAndGradientDescentV2.ipynb

10.   Regularization
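The MSE-vs-cross-entropy comparison in items 4–8 can be seen numerically: cross-entropy punishes an unconfident classifier far more sharply than MSE does. A minimal sketch (not the course notebooks; the function names are chosen here for illustration):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error, usable for regression or (poorly) classification.
    return float(np.mean((y_true - y_pred) ** 2))

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot; y_pred holds predicted class probabilities.
    return float(-np.mean(np.sum(y_true * np.log(y_pred + eps), axis=1)))

y_true = np.array([[1., 0.], [0., 1.]])
confident = np.array([[0.9, 0.1], [0.1, 0.9]])
unsure = np.array([[0.6, 0.4], [0.4, 0.6]])

# Cross-entropy grows much faster than MSE as confidence drops.
print(round(cross_entropy(y_true, confident), 4))  # ≈ 0.1054
print(round(cross_entropy(y_true, unsure), 4))     # ≈ 0.5108
print(mse(y_true, confident), mse(y_true, unsure))
```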

Neural Networks deconstructed for Supervised Learning (Classification)

1.       Introduction to Neural Networks

2.       Activation functions

3.       Feed forward neural network

4.       Back propagation and Gradient descent

5.       Learning Rate setting and tuning

6.       Hands-on Python Demo: Building a Neural Network from Scratch

7.       NN_MNIST_Scratch_v1.ipynb

8.       Hands-on Python Demo: Visualizing binary decision boundaries

9.       NN from scratch with boundary visualized v1.ipynb
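The "neural network from scratch" material above reduces to a feed-forward pass plus chain-rule backpropagation. The sketch below assumes a tiny 3-4-1 sigmoid network of my own choosing (not the MNIST notebook) and checks one analytic gradient against a numerical one, the standard sanity check when building a network from scratch:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A tiny 3-4-1 network with random weights and MSE loss.
x = rng.normal(size=(1, 3))
y = np.array([[1.0]])
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def forward_loss(W1_):
    h = sigmoid(x @ W1_ + b1)
    p = sigmoid(h @ W2 + b2)
    return float(np.mean((p - y) ** 2))

# Forward pass, keeping intermediates for backprop.
h = sigmoid(x @ W1 + b1)
p = sigmoid(h @ W2 + b2)

# Backward pass: apply the chain rule layer by layer.
dp = 2 * (p - y) / p.size   # d(loss)/dp for the MSE loss
dz2 = dp * p * (1 - p)      # sigmoid'(z) = p * (1 - p)
dh = dz2 @ W2.T             # gradient flowing into the hidden layer
dz1 = dh * h * (1 - h)
dW1 = x.T @ dz1             # gradient for the first weight matrix

# Sanity-check one entry against a central-difference numerical gradient.
eps = 1e-5
Wp, Wm = W1.copy(), W1.copy()
Wp[0, 0] += eps
Wm[0, 0] -= eps
num_grad = (forward_loss(Wp) - forward_loss(Wm)) / (2 * eps)
print(abs(num_grad - dW1[0, 0]) < 1e-6)  # → True
```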

 

Building Blocks of Neural Networks

1.       Feed forward

2.       Back Propagation

3.       Fully Connected Layer – Forward pass

4.       Fully Connected Layer – Backward pass

5.       Activation Functions

6.       Activation functions – In practice

7.       Softmax

8.       Cross-Entropy Loss

9.       Hands-on Python demo

         1.       MNIST walk-through of building blocks of NN

         2.       MNIST Python Neural Network_Final.ipynb

10.   Hands On Lab

         1.       DLCP Project 1 Brief.pdf

         2.       SVHN Python Neural Network_Milestone2_Questions.ipynb

         3.       SVHN Python Neural Network_Solution_notebook.ipynb
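The softmax and cross-entropy building blocks (items 7–8) are worth seeing in code. A minimal NumPy sketch, not taken from the module notebooks:

```python
import numpy as np

def softmax(z):
    # Subtract the row max before exponentiating: numerically stable,
    # and the result is mathematically unchanged.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])
probs = softmax(logits)
print(np.round(probs, 3))  # a valid probability distribution over 3 classes

# Cross-entropy loss for true class 0, and its well-known gradient
# with respect to the logits: probs - one_hot.
one_hot = np.array([[1.0, 0.0, 0.0]])
loss = float(-np.sum(one_hot * np.log(probs)))
dlogits = probs - one_hot
print(round(loss, 4))  # ≈ 0.417
```

The simple form of the gradient (`probs - one_hot`) is why softmax and cross-entropy are almost always paired in the backward pass.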

 

TensorFlow, Keras, and TensorBoard

1.       Introduction to TensorFlow

2.       Computational Graph

3.       Hands-on in TensorFlow: Linear regression on Boston Housing prices

4.       Introduction to Keras

5.       Build a Deep Neural Network in Keras: MNIST Dataset

6.       Using TensorBoard

7.       Code used in the module

8.       Tensorflow_Hello_World.ipynb

9.       Boston_Housing_Prices.ipynb

10.   Classification_MNIST_DNN_Keras.ipynb

11.   Boston_Housing_Prices_KERAS.ipynb
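Underlying TensorFlow is the computational-graph idea from item 2: build a graph of simple operations, run a forward pass through it, then propagate gradients backward node by node. A hand-rolled sketch of that idea in plain Python (not the TensorFlow API), for y = (a + b) * c:

```python
# Graph inputs.
a, b, c = 2.0, 3.0, 4.0

# Forward pass through the graph, node by node.
s = a + b   # add node
y = s * c   # multiply node

# Backward pass: start from dy/dy = 1 and apply the chain rule
# through each node in reverse order (reverse-mode autodiff).
dy = 1.0
ds = dy * c  # multiply node: dy/ds = c
dc = dy * s  # multiply node: dy/dc = s
da = ds      # add node routes the gradient through unchanged
db = ds

print(y, da, db, dc)  # → 20.0 4.0 4.0 5.0
```

TensorFlow automates exactly this bookkeeping for arbitrarily large graphs, which is what makes training deep networks practical.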

Babysitting the Neural Network

1.       Introduction to babysitting the learning process

2.       Data Preprocessing

3.       Data Augmentation

4.       Weight initialization

5.       Regularization – Batch Normalization

6.       Regularization – Dropout

7.       Hands-on Python demo: Babysitting the neural network and hyperparameter optimization

8.       Babysitting-MNIST Python Neural Network-FINAL.ipynb

9.       Visualizations
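Two of the babysitting techniques above, data preprocessing and dropout regularization, are easy to sketch in NumPy. This illustrates the general techniques, not the course notebook; `dropout` here is an inverted-dropout implementation of my own:

```python
import numpy as np

rng = np.random.default_rng(42)

# Data preprocessing: zero-center and scale each feature (column)
# so all inputs are on a comparable scale before training.
X = rng.normal(loc=5.0, scale=3.0, size=(200, 4))
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
print(np.allclose(X_std.mean(axis=0), 0.0))  # → True
print(np.allclose(X_std.std(axis=0), 1.0))   # → True

# Inverted dropout: at train time, randomly zero activations and
# rescale by 1/p_keep so the expected activation is unchanged;
# at test time, the layer is a no-op.
def dropout(h, p_keep=0.8, train=True):
    if not train:
        return h
    mask = (rng.random(h.shape) < p_keep) / p_keep
    return h * mask

h = np.ones((1000, 16))
h_train = dropout(h, p_keep=0.8)
# Expected value is preserved (up to sampling noise).
print(abs(h_train.mean() - 1.0) < 0.05)  # → True
```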

Project Work

1.       DLCP Project 1 Brief.pdf

2.       Babysitting-SVHN Python Neural Network_Questions.ipynb

3.       Babysitting_SVHN Keras Neural Network_Questions.ipynb

4.       Babysitting_SVHN Keras Neural Network_Solution_notebook.ipynb

5.       Babysitting_SVHN Python Neural Network_Solution_notebook.ipynb