Activation Functions Flashcards

1
Q

What step occurs after the weighted summation?

A

The activation function

2
Q

What does the activation function do?

A

transforms the weighted sum into a single output number
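A minimal sketch of a single neuron, assuming NumPy and made-up weights, showing the weighted summation followed by the activation function (sigmoid here):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real-valued weighted sum into the range (0, 1)
    return 1 / (1 + np.exp(-z))

# Made-up inputs, weights, and bias, purely for illustration
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.7, -0.2])
b = 0.1

z = np.dot(w, x) + b   # step 1: the weighted summation
a = sigmoid(z)         # step 2: the activation function turns z into the output number
print(a)               # ~0.24
```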

3
Q

Why must the activation function be non-linear?

A

Because a composition of linear functions is itself linear: without a non-linear activation, stacked layers collapse into a single linear transformation and cannot change the way the data is represented (see the sketch below)
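A quick numeric sketch of why, with made-up weight matrices (NumPy assumed): two layers stacked with no non-linearity in between collapse into a single linear layer, so depth buys nothing:

```python
import numpy as np

# Two hypothetical linear layers with no activation in between
W1 = np.array([[1.0, 2.0], [0.5, -1.0]])
W2 = np.array([[0.3, 0.8], [-0.6, 1.1]])
x = np.array([1.0, 2.0])

# Applying both layers in sequence...
y = W2 @ (W1 @ x)

# ...is identical to applying one combined linear layer
print(np.allclose(y, (W2 @ W1) @ x))  # True: the second layer added no power
```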

4
Q

What does the output from the first hidden layer become?

A

the new input for the second hidden layer

This process is repeated until the final output of the neural network is achieved
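A minimal sketch of that chaining, using made-up random weights (NumPy assumed): each hidden layer's output is fed straight in as the next layer's input until the final output is produced:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

rng = np.random.default_rng(0)
x = rng.normal(size=4)                                # the network's input
W1, b1 = rng.normal(size=(3, 4)), rng.normal(size=3)  # first hidden layer
W2, b2 = rng.normal(size=(2, 3)), rng.normal(size=2)  # second hidden layer

h1 = relu(W1 @ x + b1)   # output of the first hidden layer...
out = W2 @ h1 + b2       # ...becomes the input to the second
print(out)               # final output of this tiny network
```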

5
Q

What are the four primary types of activation function?

A
  1. Sigmoid function
  2. Tanh function
  3. ReLU function
  4. Leaky ReLU function
(see the code sketch below)
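A code sketch of the four functions (NumPy assumed; the leaky-ReLU slope of 0.01 is a common but arbitrary choice):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))           # output bounded in (0, 1)

def tanh(z):
    return np.tanh(z)                     # output bounded in (-1, 1)

def relu(z):
    return np.maximum(0, z)               # 0 below the threshold, z above it

def leaky_relu(z, slope=0.01):
    return np.where(z > 0, z, slope * z)  # small slope instead of 0 for negative z
```
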
6
Q

Sigmoid function

A

-No constraint on the input; it can be positive or negative (x-axis)
-Output is bounded between 0 and 1 (y-axis)
-Use this when you want the output to be between 0 and 1 (i.e. a probability), such as in image identification

7
Q

Tanh function

A

-No constraint on the input
-Output is bounded between -1 and 1

8
Q

ReLU function

A

-Outputs 0 until the weighted sum (the x-axis) becomes larger than 0; above that threshold, the output equals the weighted sum itself - hence it is known as a threshold function
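A quick numeric illustration of the threshold behaviour (NumPy assumed):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

print(relu(-2.0))  # 0.0: a weighted sum below the threshold is cut off
print(relu(3.0))   # 3.0: above 0, the weighted sum passes straight through
```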

9
Q

Leaky ReLU function

A

-Same as ReLU for positive inputs, but for negative inputs the output is a small fraction of x, so y still increases a little as x increases instead of staying flat at 0
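The same check for the leaky variant (NumPy assumed; the slope of 0.01 is an arbitrary but common choice):

```python
import numpy as np

def leaky_relu(z, slope=0.01):
    return np.where(z > 0, z, slope * z)

print(leaky_relu(-2.0))  # -0.02: a small negative output instead of a flat 0
print(leaky_relu(3.0))   # 3.0: identical to ReLU for positive inputs
```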

10
Q

What does a neural network learn to do in the hidden layers?

A

to approximate any transformation function

11
Q

transformation function

A

the combination of activation functions that the neural network applies across its layers

12
Q

What function can you use when you don’t want a negative output?

A

sigmoid or ReLU function, since neither can output a negative number

e.g. estimating the price of a house

13
Q

Which function should you use for the stock market?

A

tanh function, because its output can be negative as well as positive (a stock can fall as well as rise)

14
Q

What function would you use if asking AI to identify an object?

A

sigmoid function, because its output is constrained between 0 and 1 and can be read as a probability

15
Q

sigmoid function conceptual explanation

A

-When the weighted sum becomes large, the y value approaches 1.
-The larger the weighted sum, the more confident the activation function becomes.
-If the weighted sum becomes very negative, the output will be close to 0, so the probability is very low.
-If the weighted sum adds to 0, then it’s 50/50 - the network has no idea
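A quick numeric check of these claims (NumPy assumed):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

print(sigmoid(6.0))   # ~0.998: large weighted sum, very confident "yes"
print(sigmoid(-6.0))  # ~0.002: very negative weighted sum, probability near 0
print(sigmoid(0.0))   # 0.5: weighted sum of 0, a 50/50 guess
```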

16
Q

What happens when the e^-x term in the sigmoid becomes very large?

A

the denominator of the sigmoid, 1 / (1 + e^-x), grows, so the output probability is driven down toward 0