Neural Networks Flashcards

1
Q

What is a feedforward network?

A

A network in which the input flows in one direction through a sequence of layers: each layer performs its computation and passes the result on to the next layer, with no feedback loops.
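
A minimal NumPy sketch of a forward pass (the layer sizes, random weights, and ReLU activation are illustrative assumptions, not part of the card):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def feedforward(x, params):
    # Hidden layer: linear transform followed by a nonlinearity
    h = relu(params["W1"] @ x + params["b1"])
    # Output layer: another linear transform produces the network output
    return params["W2"] @ h + params["b2"]

# Toy parameters: 3 inputs -> 4 hidden units -> 2 outputs
rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(size=(4, 3)), "b1": np.zeros(4),
    "W2": rng.normal(size=(2, 4)), "b2": np.zeros(2),
}
print(feedforward(np.array([1.0, 2.0, 3.0]), params))
```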

2
Q

How does backpropagation work?

A
  • Run one feedforward cycle
  • Compare the expected output to the network's output (y-hat)
  • Calculate the error (loss) between them
  • Run backprop: propagate the error backwards and adjust each weight according to how much it contributed to the error, reducing the influence of neurons that predict badly and increasing it for those that predict well
  • Repeat (see the sketch below)
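
A minimal sketch of one such cycle, assuming a single linear layer with a mean-squared-error loss (both chosen only for illustration):

```python
import numpy as np

def backprop_step(x, y, W, lr=0.1):
    # Forward pass: single linear layer
    y_hat = W @ x
    # Error term: d(loss)/d(y_hat) for the loss 0.5 * ||y_hat - y||^2
    error = y_hat - y
    # Backward pass: gradient of the loss w.r.t. the weights
    grad_W = np.outer(error, x)
    # Update: move the weights against the gradient
    return W - lr * grad_W

rng = np.random.default_rng(0)
W = rng.normal(size=(1, 2))
x, y = np.array([1.0, 2.0]), np.array([3.0])
for _ in range(50):            # repeat the cycle
    W = backprop_step(x, y, W)
print(W @ x)                   # approaches the target y = [3.0]
```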
3
Q

What is overfitting?

A

Your model performs well on the training data but poorly on unseen test data: it has memorized specifics of the training set instead of learning a pattern that generalizes.
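
A quick way to see this numerically (a toy 1-D regression chosen only for illustration): a high-degree polynomial memorizes the noisy training points but fails on held-out points.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.shape)

# Hold out half the points as a "test" set
x_train, y_train = x[::2], y[::2]
x_test,  y_test  = x[1::2], y[1::2]

# A degree-9 polynomial has enough capacity to memorize the 10 training points
coeffs = np.polyfit(x_train, y_train, deg=9)
train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_err  = np.mean((np.polyval(coeffs, x_test)  - y_test)  ** 2)
print(f"train MSE: {train_err:.3f}   test MSE: {test_err:.3f}")  # test error is far larger
```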

4
Q

What is underfitting?

A

Your model performs poorly even on the training data: it is too simple (or too little trained) to capture the underlying pattern.

5
Q

When do you want to use L1 regularization?

A

L1 is better when you want to find the most useful features with fewer weights: it drives many weights to exactly zero, giving a sparse model that effectively performs feature selection.
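
A minimal sketch of the sparsity effect, assuming a proximal (soft-thresholding) update for the L1 penalty and arbitrary lr/lam values:

```python
import numpy as np

def l1_step(w, grad, lr=0.1, lam=0.5):
    # One proximal-gradient step for a loss with an L1 penalty lam * sum(|w|):
    # take the usual gradient step, then soft-threshold. Weights smaller than
    # lr * lam are snapped to exactly zero, which is where L1's sparsity comes from.
    w = w - lr * grad
    return np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)

w = np.array([0.80, 0.02, -0.01, -0.60])
print(l1_step(w, grad=np.zeros_like(w)))  # -> [0.75, 0.0, -0.0, -0.55]: tiny weights become exactly zero
```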

6
Q

When do you want to use L2 regularization?

A

L2 is usually better for training neural models: it shrinks all weights toward zero (weight decay) without eliminating any of them, so every feature keeps some influence.
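
A minimal sketch of the corresponding L2 (weight-decay) update, with the same illustrative lr/lam values as above:

```python
import numpy as np

def l2_step(w, grad, lr=0.1, lam=0.5):
    # L2 adds lam/2 * sum(w**2) to the loss, so its gradient is simply lam * w.
    # Every weight shrinks proportionally ("weight decay"); none is forced to zero.
    return w - lr * (grad + lam * w)

w = np.array([0.80, 0.02, -0.01, -0.60])
print(l2_step(w, grad=np.zeros_like(w)))  # -> [0.76, 0.019, -0.0095, -0.57]: all shrink, none hit zero
```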

7
Q

Why use dropout?

A

Because dropout prevents individual neurons from becoming overly decisive in the decision-making: randomly dropping units during training forces the network to spread information across many neurons, which reduces co-adaptation and overfitting.
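
A minimal sketch of (inverted) dropout applied to a layer's activations, assuming a drop probability of 0.5 chosen for illustration:

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True):
    # Inverted dropout: randomly zero activations during training and rescale
    # the survivors so the expected activation stays the same.
    if not training:
        return activations  # no dropout at inference time
    mask = np.random.default_rng().random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

h = np.array([0.2, 1.5, -0.7, 0.9])
print(dropout(h))  # roughly half the units are zeroed on each call
```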

8
Q

What is Stochastic Gradient Descent?

A

SGD lets you split the training data into mini-batches each epoch. You run each batch through the network, update the weights via backpropagation, and repeat until every batch has been used; then you start the next epoch. Each update uses only an estimate of the full gradient, so updates are cheaper and more frequent than with full-batch gradient descent.
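
A minimal sketch of the mini-batch loop, assuming a linear model with a mean-squared-error loss (the model, loss, and hyperparameters are illustrative):

```python
import numpy as np

def sgd(X, y, w, lr=0.1, batch_size=32, epochs=10):
    rng = np.random.default_rng(0)
    n = len(X)
    for _ in range(epochs):
        order = rng.permutation(n)                 # reshuffle every epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]  # one mini-batch
            X_b, y_b = X[idx], y[idx]
            # Gradient of the mean-squared error estimated on this batch only
            grad = 2 * X_b.T @ (X_b @ w - y_b) / len(idx)
            w = w - lr * grad                      # weight update after every batch
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5])
print(sgd(X, y, w=np.zeros(3)))  # approaches the true weights [1.0, -2.0, 0.5]
```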
