Deep Learning Flashcards

1
Q

What is an activation function?

A

An “activation function” is a function applied at each node of a network. It converts the node’s input into the node’s output.
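A minimal sketch of the idea, with made-up inputs and weights and `tanh` chosen arbitrarily as the activation: a node sums its weighted inputs, then the activation function maps that sum to the node’s output.

```python
import math

def node_output(inputs, weights, activation):
    # A node first computes the weighted sum of its inputs...
    total = sum(i * w for i, w in zip(inputs, weights))
    # ...then the activation function converts that sum into the output.
    return activation(total)

# Weighted sum is 1.0*0.5 + 2.0*(-0.25) = 0.0, and tanh(0.0) = 0.0
print(node_output([1.0, 2.0], [0.5, -0.25], math.tanh))
```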

2
Q

What is ReLU?

A

The rectified linear activation function (ReLU) has been shown to lead to very high-performing networks. It takes a single number as input, returning 0 if the input is negative and the input itself otherwise.

Here are some examples:
relu(3) = 3
relu(-3) = 0
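The examples above can be reproduced with a one-line implementation:

```python
def relu(x):
    # Return 0 for negative inputs, the input itself otherwise
    return max(0, x)

print(relu(3))   # 3
print(relu(-3))  # 0
```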

3
Q

Good to remember: at least the basics of the mean-squared-error loss function’s slope

A

When plotting the mean-squared-error loss function against the predictions, the slope with respect to the weights is 2 * x * (xb - y), or 2 * input_data * error, where error is the prediction minus the target. Note that x and b may contain multiple numbers (x is a vector of features for each data point, and b is a vector of weights). In that case, the slope is also a vector, which is exactly what you want.
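A quick sketch of this formula (the values of `x`, `b`, and `y` below are made up for illustration), checked against a numerical central-difference derivative of the squared error:

```python
import numpy as np

x = np.array([1.0, 2.0])   # input features for one data point
b = np.array([0.5, -0.5])  # weights
y = 1.0                    # target

error = x @ b - y          # prediction minus target
slope = 2 * x * error      # 2 * input_data * error, a vector

# Verify against a numerical derivative of (y - x.b)**2
eps = 1e-6
def loss(w):
    return (y - x @ w) ** 2

numeric = np.array([
    (loss(b + eps * np.eye(2)[i]) - loss(b - eps * np.eye(2)[i])) / (2 * eps)
    for i in range(2)
])
print(np.allclose(slope, numeric))  # True
```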
