Unit 1 Fundamentals of Deep Learning Flashcards
What is Deep Learning?
- Subset of machine learning that mimics the way the human brain functions
- Like the brain, it learns by passing signals through layered networks of artificial neurons
Example of Deep learning:
How a toddler learns what a dog is
Types of neural networks used in deep learning:
- Convolutional neural network (CNN)
- Recurrent neural network (RNN)
- Long short-term memory (LSTM)
What is a Perceptron?
- Basic unit used to build an ANN
- Takes real-valued inputs (not only Boolean)
- Calculates a linear combination of these inputs and generates an output
- If the result is greater than the threshold, the output is 1; otherwise -1
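The rule above can be sketched in a few lines of Python (the weights, bias, and threshold values here are purely illustrative, not part of the definition):

```python
def perceptron(inputs, weights, bias, threshold=0.0):
    """Output 1 if the linear combination exceeds the threshold, else -1."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > threshold else -1

# Hand-picked weights that make the perceptron behave like an AND gate
print(perceptron([1.0, 1.0], [0.5, 0.5], -0.7))  # 1
print(perceptron([1.0, 0.0], [0.5, 0.5], -0.7))  # -1
```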
What is a multilayer perceptron?
Like neurons in the human brain, perceptrons form a dense network with each other, and this dense network is known as a multilayer perceptron
What is a feed forward neural network?
- Type of neural network consisting of three types of layers:
1. Input layer
2. Hidden layers
3. Output layer
- Inputs are passed on in the forward direction
Types of Feed forward network:
- Single-layer feed-forward network (2 layers: input and output)
- Multilayer feed-forward network (3 layers: input, hidden, and output)
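A minimal forward pass through the three layer types might look like the sketch below (the layer sizes, weight values, and choice of sigmoid activation are assumptions for illustration only):

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer: a weighted sum plus bias per neuron."""
    return [sum(w * x for w, x in zip(ws, inputs)) + b
            for ws, b in zip(weights, biases)]

def sigmoid_layer(values):
    """Squash each neuron's sum into (0, 1)."""
    return [1.0 / (1.0 + math.exp(-z)) for z in values]

# 2 inputs -> 3 hidden neurons -> 1 output; signals flow forward only
x = [0.5, -1.0]                                   # input layer
hidden = sigmoid_layer(dense(x,
    [[0.1, 0.4], [-0.2, 0.3], [0.5, 0.0]],        # hidden-layer weights
    [0.0, 0.1, -0.1]))                            # hidden-layer biases
output = sigmoid_layer(dense(hidden,
    [[0.3, -0.6, 0.2]], [0.05]))                  # output layer
print(output)
```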
What is Back-propagation?
- Used in multilayer feed-forward networks
- Algorithm used to adjust the weights and biases by minimizing the error
- The error is propagated backwards towards the input layer, and each neuron adjusts its weights and biases accordingly
- Uses gradient descent to find the optimal weights and biases
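The idea can be sketched for a single sigmoid neuron (the input, target, starting weight, and learning rate below are made-up values; a real network repeats this chain-rule step layer by layer, backwards from the output):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.5, 1.0        # one training input and its desired output
w, b, lr = 0.2, 0.0, 0.5    # initial weight, bias, and learning rate

for step in range(100):
    y = sigmoid(w * x + b)      # forward pass
    error = y - target          # prediction error
    dz = error * y * (1 - y)    # chain rule back through the sigmoid
    w -= lr * dz * x            # gradient descent on the weight
    b -= lr * dz                # ...and on the bias

print(sigmoid(w * x + b))  # prediction has moved towards the target
```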
(BPN)
What are weights?
Define the strength of the connections between neurons
(BPN)
What are biases?
Additional parameter that shifts the activation function to the left or the right
What is gradient descent?
- An iterative optimization algorithm
- Aims to minimize the cost function, i.e. the error between the predicted and actual value
- Finds a local or global minimum (the point of convergence) of a differentiable function
Process of Gradient descent:
- Select an arbitrary starting point
- Calculate the slope (gradient) at that point
- Update the weights and biases in the direction opposite to the slope
- With each update the slope becomes flatter, approaching 0 as it reaches the minimum
- The learning rate sets the step size: too large and it overshoots the minimum, too small and convergence is slow
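The steps above can be sketched for a one-dimensional cost function (the function f(w) = (w - 3)^2, the starting point, and the learning rate are chosen purely for illustration):

```python
def grad(w):
    """Slope of the cost f(w) = (w - 3)**2 at point w."""
    return 2 * (w - 3)

w = 10.0    # arbitrary starting point
lr = 0.1    # learning rate: step size of each update
for _ in range(100):
    w -= lr * grad(w)   # step in the direction opposite to the slope

print(round(w, 4))  # 3.0 -- converges to the minimum at w = 3
```

A larger learning rate (e.g. 1.1) would make the updates overshoot and diverge; a much smaller one (e.g. 0.001) would still be far from the minimum after 100 steps.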
Solutions to the vanishing gradient problem:
- ReLU
- LSTM (constant gradient size)
- Gradient clipping
- Residual neural network
- Multi-level hierarchy
What is vanishing gradient problem?
As the error is back-propagated through many layers, the gradient becomes smaller and smaller until it is almost 0, so the weights of the early layers barely update and learning stalls
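The shrinking effect can be shown with a quick calculation: the sigmoid's derivative is at most 0.25, and back-propagation multiplies one such factor per layer, so the gradient shrinks geometrically with depth (the 20-layer depth here is just an example):

```python
# Each sigmoid layer multiplies the back-propagated gradient by at most
# sigma'(z) <= 0.25, so depth shrinks the gradient geometrically.
max_sigmoid_grad = 0.25
gradient = 1.0
for layer in range(20):
    gradient *= max_sigmoid_grad

print(gradient)  # about 9.1e-13 after 20 layers -- effectively vanished
```

This is why ReLU (derivative 1 for positive inputs) and LSTM cells (roughly constant gradient size) from the list above help: they avoid multiplying by many factors below 1.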
What is the activation function?
Decides whether a neuron will fire or not for a given set of inputs, depending on a rule or threshold
It can be thought of as a mathematical gate between the input feeding the current neuron and the output going to the next layer
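Three common gates of this kind can be sketched as follows (a minimal illustration; many other activation functions exist):

```python
import math

def step(z, threshold=0.0):
    """Fires (1) only when the input crosses the threshold."""
    return 1 if z > threshold else 0

def relu(z):
    """Passes positive signals through unchanged, blocks negative ones."""
    return max(0.0, z)

def sigmoid(z):
    """Squashes any input into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

print(step(0.7), relu(-2.0), sigmoid(0.0))  # 1 0.0 0.5
```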