Deep Learning Flashcards

1
Q

Step function

A

Ystep =

1 if X >= 0
0 if X < 0

2
Q

Sign function

A

Ysign =

+1 if X >= 0
-1 if X < 0
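The step and sign cards above can be sketched as plain Python functions (a minimal sketch; the function names are my own):

```python
def step(x):
    # Y_step: 1 if x >= 0, else 0
    return 1 if x >= 0 else 0

def sign(x):
    # Y_sign: +1 if x >= 0, else -1
    return 1 if x >= 0 else -1
```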

3
Q

Sigmoid function

A

Ysigmoid = 1 / (1 + e^(-x))

(approaches 1 as X -> +infinity and 0 as X -> -infinity, so it is a smooth version of the step function)
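A minimal Python sketch of the sigmoid card, using the formula 1 / (1 + e^(-x)):

```python
import math

def sigmoid(x):
    # Y_sigmoid = 1 / (1 + e^(-x)); output lies in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))
```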

4
Q

Linear function

A

Ylinear = X

5
Q

Tanh function

A

Ytanh = (e^x - e^(-x)) / (e^x + e^(-x))

(approaches +1 as X -> +infinity and -1 as X -> -infinity, so it is a smooth version of the sign function)
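The tanh card can be sketched directly from the formula and checked against the library version (a minimal sketch):

```python
import math

def tanh_manual(x):
    # (e^x - e^(-x)) / (e^x + e^(-x)); output lies in (-1, 1)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

# the manual formula agrees with math.tanh
diff = abs(tanh_manual(1.0) - math.tanh(1.0))
```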

6
Q

Number of parameters formula

A

1st conv layer:
f * f * nc * nf + nf

Conv layers after the 1st:
f * f * pnf * nf + nf

(f = filter size, nc = number of input channels, nf = number of filters in this layer, pnf = number of filters in the previous layer; the final + nf counts the biases)
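The parameter-count card can be sketched as one helper covering both cases, since the "after 1st" case just uses the previous layer's filter count as the input-channel count (a minimal sketch; the example sizes are my own):

```python
def conv_params(f, in_ch, nf):
    # f*f*in_ch weights per filter, nf filters, plus nf biases
    return f * f * in_ch * nf + nf

# 1st conv layer: 3x3 filters on an RGB input (nc = 3), 16 filters
first = conv_params(3, 3, 16)    # 3*3*3*16 + 16 = 448

# next conv layer: previous layer gave pnf = 16 channels, 32 filters
second = conv_params(3, 16, 32)  # 3*3*16*32 + 32 = 4640
```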

7
Q

Output size formula

A

((n + 2p - f) / s + 1) * ((n + 2p - f) / s + 1)

(n = input size, p = padding, f = filter size, s = stride)
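A minimal Python sketch of the output-size card (the example numbers are my own; integer division assumes n + 2p - f divides evenly by s):

```python
def conv_output_size(n, f, p, s):
    # one spatial dimension: (n + 2p - f) / s + 1
    return (n + 2 * p - f) // s + 1

# 32x32 input, 5x5 filter, no padding, stride 1 -> 28x28 output
h = w = conv_output_size(32, 5, 0, 1)
```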

8
Q

Derivative rules

A

d/dx of x^3 + 2x + 3:

x^3 -> 3x^2 (bring the power down and subtract 1 from it)
2x -> 2 (the coefficient remains; the power of x drops by 1)
3 -> 0 (any constant on its own becomes 0)

(Any constant factor attached by * remains as it is)
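The term-by-term derivative above can be checked numerically with a central difference (a minimal sketch; the check point x = 2 is my own choice):

```python
def f(x):
    return x**3 + 2*x + 3

def df(x):
    # term by term: x^3 -> 3x^2, 2x -> 2, 3 -> 0
    return 3 * x**2 + 2

# central difference approximates the derivative at x = 2
h = 1e-6
numeric = (f(2 + h) - f(2 - h)) / (2 * h)
```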

9
Q

Euclidean distance

A

Square root of the summed squared differences: sqrt((a1 - b1)^2 + (a2 - b2)^2 + …)

10
Q

Manhattan distance

A

Sum of the absolute differences:
|a1 - b1| + |a2 - b2| + …
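Both distance cards can be sketched over equal-length vectors (a minimal sketch; the function names are my own):

```python
import math

def euclidean(a, b):
    # square root of the summed squared differences
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def manhattan(a, b):
    # sum of the absolute differences
    return sum(abs(ai - bi) for ai, bi in zip(a, b))
```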

11
Q

Neural network output

A

Y = activation(sum of (input_i * weight_i) - bias)
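A minimal Python sketch of this card (the AND-gate weights and bias are my own illustration, paired with a step activation):

```python
def step(x):
    return 1 if x >= 0 else 0

def neuron_output(inputs, weights, bias, activation):
    # weighted sum of the inputs minus the bias, then the activation
    total = sum(x * w for x, w in zip(inputs, weights)) - bias
    return activation(total)

# hand-chosen weights/bias make this neuron compute logical AND
y = neuron_output([1, 1], [0.5, 0.5], 0.7, step)  # 1
```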

12
Q

Perceptron learning rule

A

e(p) = Yd(p) - Y(p)

(If the error e(p) is positive, we need to increase the perceptron output Y(p); if it is negative, decrease Y(p))
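The error signal on this card drives the standard perceptron weight update, delta_w_i = lr * x_i * e, which the card does not spell out; the sketch below assumes that rule and trains a two-input perceptron on AND (a minimal sketch, not from the card):

```python
def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(samples, targets, lr=0.1, epochs=20):
    # e(p) = Yd(p) - Y(p); each weight moves by lr * input * error
    w = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for x, yd in zip(samples, targets):
            y = step(x[0] * w[0] + x[1] * w[1] - bias)
            e = yd - y
            w[0] += lr * x[0] * e
            w[1] += lr * x[1] * e
            bias -= lr * e
    return w, bias

# learn logical AND
samples = [[0, 0], [0, 1], [1, 0], [1, 1]]
targets = [0, 0, 0, 1]
w, bias = train_perceptron(samples, targets)
```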

13
Q

ReLU function

A

Y = max(0,x)

14
Q

Leaky ReLU

A

Y = max(0.1x, x)
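Both rectifier cards can be sketched as follows; note the leaky variant compares x against 0.1 * x (a small slope on negative inputs), not against the constant 0.1 (a minimal sketch):

```python
def relu(x):
    # max(0, x): negative inputs clamp to zero
    return max(0, x)

def leaky_relu(x, alpha=0.1):
    # max(alpha * x, x): negatives keep a small slope instead of zero
    return max(alpha * x, x)
```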
