Extra Flashcards

1
Q

Why do we want non-polynomial functions?

A

Because polynomial functions can have multiple low or high points (local minima and maxima) on their graph, so an optimizer can end up with multiple different answers instead of a single one.
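A minimal sketch of the problem (the polynomial and learning rate are made up for illustration): f(x) = x⁴ − 2x² has two separate low points, at x = −1 and x = +1, so gradient descent settles on a different answer depending on where it starts.

```python
# f(x) = x**4 - 2*x**2 has two local minima, at x = -1 and x = +1.
def grad(x):
    # Derivative of f: 4x^3 - 4x
    return 4 * x**3 - 4 * x

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent from a chosen starting point.
    for _ in range(steps):
        x -= lr * grad(x)
    return round(x, 3)

# Two different starting points converge to two different minima.
print(descend(-0.5), descend(0.5))
```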

2
Q

If we have an error of 500 in regression, how do we correct that?

A

Gradient descent will use backpropagation to adjust the theta values in the direction that reduces the error of 500 toward a minimum.
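A minimal sketch of that update loop, assuming a one-parameter linear model y = theta · x and a made-up learning rate (the data and values are illustrative, not from the card):

```python
# Toy data generated by y = 2 * x, so the "right" theta is 2.0.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

theta = 0.0
lr = 0.05  # learning rate (assumed for the sketch)
for _ in range(200):
    # Gradient of the mean squared error with respect to theta.
    grad = sum(2 * (theta * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    # Step against the gradient so the error shrinks each iteration.
    theta -= lr * grad

print(round(theta, 3))
```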

3
Q

Why do we square cost functions in regression?

A

We want to keep negative errors from cancelling out positive errors when they are added in the sum; squaring makes every term non-negative.
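A quick illustration with made-up error values:

```python
# Raw errors of equal size but opposite sign cancel in a plain sum.
errors = [3.0, -3.0, 1.0, -1.0]

raw_sum = sum(errors)                     # 0.0 — looks like a perfect fit
squared_sum = sum(e * e for e in errors)  # 20.0 — the real misfit shows up

print(raw_sum, squared_sum)
```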

4
Q

Is the XOR function linear or non-linear?

A

It is, of course, non-linear.

It returns a binary result, 1 or 0, depending on whether exactly one of its inputs is 1; no single straight line can separate the 1-outputs from the 0-outputs.
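The truth table makes the non-linearity visible:

```python
# XOR truth table: the output is 1 exactly when the two inputs differ.
table = [(a, b, a ^ b) for a in (0, 1) for b in (0, 1)]
for a, b, out in table:
    print(a, b, "->", out)

# The 1-outputs, (0,1) and (1,0), sit on one diagonal and the 0-outputs,
# (0,0) and (1,1), on the other — no single straight line separates them.
```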

5
Q

What is key and value in a dictionary in terms of Markov Chains?

A

In Markov chains, the key is the word you want to generate the following word for, and the value is a list of the possible words that can follow it.
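A minimal sketch of building such a dictionary (the example sentence is made up):

```python
from collections import defaultdict

# Key = a word, value = the list of words observed to follow it.
text = "the cat sat on the mat the cat ran"
words = text.split()

chain = defaultdict(list)
for current, following in zip(words, words[1:]):
    chain[current].append(following)

print(dict(chain))
```

Repeated followers stay in the list, so picking a random element naturally weights common continuations more heavily.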

6
Q

What is a decision tree?

A

It is a flowchart-like model where each internal node is a test on an attribute and each branch is an outcome of that test, e.g. left for yes and right for no.

It is used to make decisions.

It is supervised learning, since you train it on cases whose outcomes are already known.
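A tiny hand-written sketch of such a tree (the attributes and decisions are invented for illustration):

```python
# Each `if` is a node testing one attribute; returns are the leaf decisions.
def classify(sample):
    if sample["outlook"] == "sunny":        # node 1: test on 'outlook'
        if sample["humidity"] == "high":    # node 2: test on 'humidity'
            return "stay in"
        return "play"
    return "play"

print(classify({"outlook": "sunny", "humidity": "high"}))  # stay in
```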

7
Q

What is the random forest algorithm?

A

It is a supervised learning method for classification and regression that builds many decision trees and combines their outputs, e.g. by majority vote.

8
Q

What is K-nearest neighbour?

A

It is a classification algorithm that assumes similar data points lie close to each other, so a new point is labelled by the majority class of its k nearest neighbours.
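A minimal sketch on 1-D points (the training data and k are made up):

```python
from collections import Counter

# Training set: (position, label) pairs.
train = [(1.0, "A"), (1.5, "A"), (3.0, "B"), (3.5, "B")]

def knn(x, k=3):
    # Sort training points by distance to x and keep the k closest.
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    # Majority vote among their labels.
    return Counter(label for _, label in nearest).most_common(1)[0][0]

print(knn(1.2))  # near the "A" cluster
```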

9
Q

When a Deep Neural Network model learns, what is essentially the method by which it learns?

A

It searches for the weights and biases that minimize a certain cost function (e.g. the sum of squared errors).

10
Q

What is backpropagation?

A

Backpropagation is how we adjust our weights after we have identified, via the cost function, how much error we have.
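A one-neuron sketch of the idea (values are illustrative): the chain rule carries the cost's error signal back to the weight, and gradient descent applies the update.

```python
# One neuron: y = w * x, cost = (y - target)^2.
x, target = 2.0, 10.0
w = 1.0
for _ in range(100):
    y = w * x                      # forward pass
    dcost_dy = 2 * (y - target)    # derivative of the cost w.r.t. the output
    dy_dw = x                      # derivative of the output w.r.t. the weight
    grad = dcost_dy * dy_dw        # chain rule: propagate the error back to w
    w -= 0.05 * grad               # gradient-descent update

print(round(w, 3))  # approaches 5.0, since 5.0 * 2.0 == 10.0
```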

11
Q

If your model is 100% exact on your training set, what does that imply?

A

It implies that something is likely wrong, typically overfitting, and that you need to look into why.

12
Q

What can you do with your model if it is fitting the training set too precisely?
Think in terms of nodes.

A

You can remove some layers (or nodes), reducing the network's capacity so the DNN has less of a chance of fitting the training set exactly.

13
Q

What can you do if you want to make it harder for your model to fit the training set too closely?

A

You can add regularization, which basically adds a penalty, such as the sum of the squared weights, to the cost function. Then, whenever the model grows its weights to fit the training set more exactly, the penalty grows too, discouraging overly precise fits.
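A minimal sketch of an L2-style penalty (the lambda value and numbers are made up):

```python
# The penalty is ADDED to the cost, so bigger weights mean a bigger total cost.
def cost(errors, weights, lam=0.1):
    data_term = sum(e * e for e in errors)      # plain sum-of-squared-errors
    penalty = lam * sum(w * w for w in weights)  # L2 regularization term
    return data_term + penalty

small = cost([1.0, -1.0], [0.5, 0.5])  # same errors, small weights
large = cost([1.0, -1.0], [5.0, 5.0])  # same errors, large weights
print(small, large)
```

With identical errors, the large-weight model is penalized more, so the optimizer prefers smaller weights.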

14
Q

Why don't you want your model to predict 100% precisely?

A

You don't want it to be unable to predict outside of the training set, i.e. on new values. A perfectly fitted model may also have learned the errors (noise) as features.
