Deep Learning Flashcards
Step function
Ystep =
1 if X >= 0
0 if X < 0
Sign function
Ysign =
+1 if X >= 0
-1 if X < 0
Sigmoid function
Ysigmoid = 1 / (1 + e^(-x))
(output ranges between 0 and 1)
Linear function
Ylinear = X
Tanh function
Ytanh = (e^x - e^(-x)) / (e^x + e^(-x))
(output ranges between -1 and +1)
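The activation functions above can be sketched in plain Python (function names are my own, not from any library; tanh is written out from the definition rather than using math.tanh):

```python
import math

def step(x):
    # 1 if x >= 0, else 0
    return 1 if x >= 0 else 0

def sign(x):
    # +1 if x >= 0, else -1
    return 1 if x >= 0 else -1

def sigmoid(x):
    # 1 / (1 + e^(-x)), output between 0 and 1
    return 1 / (1 + math.exp(-x))

def linear(x):
    # identity: output equals input
    return x

def tanh(x):
    # (e^x - e^-x) / (e^x + e^-x), output between -1 and +1
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

print(step(-2), sign(-2))    # 0 -1
print(sigmoid(0))            # 0.5
print(tanh(1))               # same value as math.tanh(1)
```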
Number of parameters formula
1st conv layer: f * f * nc * nf + nf
(f = filter size, nc = number of input channels, nf = number of filters; the trailing + nf is one bias per filter)
After 1st conv layer: f * f * pnf * nf + nf
(pnf = number of filters in the previous layer)
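The parameter-count formula can be checked with a small helper (the function name and the example layer sizes are my own choices):

```python
def conv_params(f, channels_in, nf):
    # f*f*channels_in weights per filter, times nf filters,
    # plus one bias per filter
    return f * f * channels_in * nf + nf

# 1st conv layer: 3x3 filters on an RGB image (nc = 3), 32 filters
print(conv_params(3, 3, 32))   # 896

# Next conv layer: 3x3 filters, pnf = 32 filters from the previous layer, 64 filters
print(conv_params(3, 32, 64))  # 18496
```

For layers after the first, the previous layer's filter count plays the role of the input-channel count.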
Output size formula
((n + 2p - f) / s + 1) * ((n + 2p - f) / s + 1)
(n = input size, p = padding, f = filter size, s = stride)
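A quick sketch of the output-size formula (example input/filter sizes are my own):

```python
def conv_output_size(n, f, p, s):
    # one spatial dimension: (n + 2p - f) / s + 1
    return (n + 2 * p - f) // s + 1

# 32x32 input, 5x5 filter, no padding, stride 1 -> 28x28 output
print(conv_output_size(32, 5, 0, 1))  # 28

# same input with padding p = 2 keeps the size at 32
print(conv_output_size(32, 5, 2, 1))  # 32
```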
Derivative rules
x^3 + 2x + 3
x^3 → 3x^2 (bring the power down in front and subtract 1 from the power)
2x → 2 (the coefficient remains; the derivative of x is 1)
3 → 0 (any constant on its own becomes 0)
(Any constant factor attached by * stays as it is)
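The rules above give d/dx (x^3 + 2x + 3) = 3x^2 + 2, which can be sanity-checked numerically with a central finite difference (the helper names and the test point x = 2 are my own):

```python
def f(x):
    return x**3 + 2*x + 3

def f_prime(x):
    # derivative obtained from the rules above
    return 3 * x**2 + 2

# central finite difference approximates the derivative
h = 1e-6
x = 2.0
numeric = (f(x + h) - f(x - h)) / (2 * h)
print(f_prime(x), round(numeric, 4))  # 14.0 14.0
```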
Euclidean distance
Square root of ((a - b)^2 + …)
Manhattan distance
|a - b| + … (sum of absolute differences)
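Both distances generalise to vectors by summing over the coordinates; a minimal sketch (the example points are my own):

```python
import math

def euclidean(a, b):
    # sqrt of the sum of squared coordinate differences
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def manhattan(a, b):
    # sum of absolute coordinate differences
    return sum(abs(ai - bi) for ai, bi in zip(a, b))

p, q = (1, 2, 3), (4, 6, 3)
print(euclidean(p, q))  # 5.0  (sqrt(9 + 16 + 0))
print(manhattan(p, q))  # 7    (3 + 4 + 0)
```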
Neural network output
Y = activation function(sum of inputs * weights - bias)
Learning a perceptron
e(p) = Yd(p) - Y(p)
(Yd(p) = desired output, Y(p) = actual output. If the error e(p) is positive, we need to increase the perceptron output Y(p); if it is negative, decrease Y(p).)
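A minimal sketch of perceptron learning, assuming the standard weight-update rule w_i ← w_i + alpha * e(p) * x_i(p) with learning rate alpha (the AND-gate data, alpha = 0.1, and zero initial weights are my own choices for illustration):

```python
def step(x):
    return 1 if x >= 0 else 0

# Train a single perceptron to compute logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta, alpha = [0.0, 0.0], 0.0, 0.1

for epoch in range(20):
    for x, yd in data:
        y = step(w[0] * x[0] + w[1] * x[1] - theta)  # activation(inputs * weights - bias)
        e = yd - y                                   # e(p) = Yd(p) - Y(p)
        w[0] += alpha * e * x[0]                     # positive error raises the output
        w[1] += alpha * e * x[1]
        theta -= alpha * e                           # bias moves the opposite way

print([step(w[0] * x[0] + w[1] * x[1] - theta) for x, _ in data])  # [0, 0, 0, 1]
```

AND is linearly separable, so the updates settle on weights that classify all four patterns correctly within a few epochs.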
ReLU function
Y = max(0, x)
Leaky ReLU function
Y = max(0.1x, x)
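Both rectifiers in one short sketch; note that Leaky ReLU scales negative inputs by a small slope (0.1 here) instead of zeroing them (the slope value varies by convention):

```python
def relu(x):
    # passes positives through, clamps negatives to 0
    return max(0, x)

def leaky_relu(x, slope=0.1):
    # small negative slope instead of a hard zero, so gradients
    # still flow for x < 0
    return max(slope * x, x)

print(relu(-3), relu(3))                 # 0 3
print(round(leaky_relu(-3), 2))          # -0.3
print(leaky_relu(3))                     # 3
```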