1 Flashcards
Definition of Perceptron
Neural network unit that computes a weighted sum of its inputs (plus a bias) and applies an activation function in order to detect features in the input data
How does the Perceptron work?
- Takes multiple (m) inputs
- Multiplies every input by its corresponding weight, w
- Adds an additional learnable parameter, the bias
- Sums all the weighted values
- Passes the sum through an activation function to get the output (see the sketch below)
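A minimal NumPy sketch of this computation (the sigmoid activation and the example values are illustrative assumptions, not part of the card):

```python
import numpy as np

def perceptron(x, w, b):
    """Weighted sum of inputs plus bias, passed through an activation function."""
    z = np.dot(w, x) + b              # sum of input * weight products, plus bias
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid activation (one common choice)

x = np.array([0.5, -1.0, 2.0])   # m = 3 inputs
w = np.array([0.8, 0.2, -0.5])   # one weight per input
b = 0.1                          # learnable bias
print(perceptron(x, w, b))       # single scalar output
```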
What happens to Perceptron if the activation function does not exist?
Then it is only a linear (affine) function of the inputs: a weighted sum plus a bias
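In symbols (a short illustration, assuming the usual weight/bias notation): without an activation the unit computes only

```latex
z = \mathbf{w}^\top \mathbf{x} + b,
```

and stacking such units stays linear, since

```latex
\mathbf{W}_2(\mathbf{W}_1 \mathbf{x} + \mathbf{b}_1) + \mathbf{b}_2
  = (\mathbf{W}_2 \mathbf{W}_1)\mathbf{x} + (\mathbf{W}_2 \mathbf{b}_1 + \mathbf{b}_2),
```

which is again a single affine map.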
What are the downsides of the Perceptron? How can they be fixed?
- It is not powerful enough to represent complex, non-linear functions.
- It requires a lot of calculations.
It can be fixed by stacking multiple artificial neurons into a structure called an Artificial Neural Network (see the sketch below)
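A small sketch of that idea, stacking two layers of such neurons with a nonlinear activation between them (the layer sizes and the ReLU choice are illustrative assumptions):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def tiny_ann(x, W1, b1, W2, b2):
    """Two stacked layers of neurons; the nonlinearity between them is what
    lets the network represent non-linear functions."""
    h = relu(W1 @ x + b1)   # hidden layer: several perceptron-like units
    return W2 @ h + b2      # output layer

rng = np.random.default_rng(0)
x = rng.normal(size=3)                           # 3 inputs
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)    # 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)    # 1 output
print(tiny_ann(x, W1, b1, W2, b2))
```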
What is the difference between Forward pass and Backward pass in the ANN training cycle?
During the Forward pass, the inputs are passed through the network layer by layer to calculate the output (and the loss)
During the Backward pass, we start from the output (the loss) and calculate the gradients of the loss with respect to the weights, propagating them back towards the input, and then use those gradients to update the weights (see the sketch below)
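A minimal one-neuron training step illustrating both passes (the squared-error loss, learning rate, and toy numbers are illustrative assumptions):

```python
import numpy as np

x = np.array([1.0, 2.0])    # inputs
w = np.array([0.5, -0.3])   # weights
b = 0.0                     # bias
y = 1.0                     # target
lr = 0.1                    # learning rate

# Forward pass: inputs -> prediction -> loss
y_hat = np.dot(w, x) + b
loss = (y_hat - y) ** 2

# Backward pass: gradients of the loss w.r.t. the parameters (chain rule)
dloss_dyhat = 2 * (y_hat - y)
dloss_dw = dloss_dyhat * x     # d(w.x + b)/dw = x
dloss_db = dloss_dyhat * 1.0   # d(w.x + b)/db = 1

# Weight update: step opposite to the gradient
w -= lr * dloss_dw
b -= lr * dloss_db
print(loss, w, b)
```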
What is a gradient of a loss function?
A loss function maps an event or values of one or more variables onto a real number intuitively representing some “cost” associated with the event; an optimization problem seeks to minimize a loss function. Its gradient is the vector of partial derivatives of the loss with respect to the model parameters; it points in the direction of steepest increase of the loss, so training updates the parameters in the opposite direction
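In symbols (a short illustration, using mean-squared error as an assumed example loss):

```latex
\nabla_{\mathbf{w}} L = \left( \frac{\partial L}{\partial w_1}, \dots, \frac{\partial L}{\partial w_m} \right),
\qquad
L(\mathbf{w}) = \frac{1}{n}\sum_{i=1}^{n} (\hat{y}_i - y_i)^2
\;\Rightarrow\;
\frac{\partial L}{\partial w_j} = \frac{2}{n}\sum_{i=1}^{n} (\hat{y}_i - y_i)\, x_{ij}.
```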
Did you learn how to solve backpropagation?
1 - No, have to study
5 - I know it very well and can solve it without looking into the notes