mla Flashcards

1
Q

The two phases of the supervised ML process: Training, ________.

A

Prediction

2
Q

These concepts help us understand how well a model performs: Overfitting, Underfitting, _________.

A

generalization

3
Q

When the model fits too closely to the training dataset.
Group of answer choices

Overfitting

Generalization

Underfitting

A

Overfitting

4
Q

In supervised learning, market trend analysis is an example of:
Group of answer choices

Regression

Correlation

Prediction

Classification

A

Regression

5
Q

Logistic Regression is an example of a regression algorithm.
Group of answer choices

True

False

A

FALSE
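
Despite its name, Logistic Regression is a classification algorithm. A minimal sketch (assuming scikit-learn; the iris data is just a convenient toy set) showing that it predicts discrete class labels, not continuous values:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# The outputs are class labels (0, 1, 2), not continuous quantities.
print(clf.predict(X[:5]))
```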

6
Q

The _____ refers to the error that comes from wrong or overly simple assumptions in the learning algorithm.

A

bias

7
Q

If your model performs well on the training set but poorly on the validation set, this is:
Group of answer choices

Underfitting

Generalization

Overfitting

A

Overfitting

8
Q

There is a regression variant of the k-nearest neighbors algorithm.
Group of answer choices

True

False

A

TRUE
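
A minimal sketch, assuming scikit-learn and made-up toy data, of the regression variant, KNeighborsRegressor, which averages the targets of the k nearest neighbors:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([1.5, 2.5, 3.5, 4.5])

reg = KNeighborsRegressor(n_neighbors=2).fit(X, y)
# The 2 nearest training points to 2.4 are x=2 and x=3, so the
# prediction is the mean of their targets: (2.5 + 3.5) / 2 = 3.0.
print(reg.predict([[2.4]]))
```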

9
Q

In k-NN, when you choose a small value of k (e.g., k=1), the model becomes more complex.
Group of answer choices

True

False

A

TRUE
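
A minimal sketch (scikit-learn, synthetic data) of this effect: at k=1 the model can memorize the training set, the most complex setting, while a larger k smooths the decision boundary:

```python
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, random_state=0)
for k in (1, 15):
    clf = KNeighborsClassifier(n_neighbors=k).fit(X, y)
    # k=1 scores 1.0 on its own training data (maximum complexity);
    # k=15 averages over more neighbors and scores lower on it.
    print(k, clf.score(X, y))
```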

10
Q

In k-NN, high model complexity is underfitting.
Group of answer choices

True

False

A

FALSE

11
Q

The ‘k’ in k-nearest neighbors refers to the new closest data point.
Group of answer choices

True

False

A

FALSE

12
Q

In k-NN, high model complexity is:
Group of answer choices

Overfitting

Underfitting

A

Overfitting

13
Q

k-nearest neighbors makes a prediction for a new data point by finding the training data that exactly match it.
Group of answer choices

True

False

A

FALSE

14
Q

In k-NN, Euclidean distance is used by default as the distance measure.
Group of answer choices

True

False

A

TRUE
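
A minimal sketch, assuming scikit-learn and made-up points, confirming the default: the metric is Minkowski with p=2, which is exactly Euclidean distance:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

X = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])
y = np.array([0, 1, 1])

clf = KNeighborsClassifier(n_neighbors=2).fit(X, y)
dist, idx = clf.kneighbors([[0.0, 0.0]], n_neighbors=2)
print(clf.effective_metric_)  # 'euclidean' (minkowski with p=2)
print(dist)                   # [[0., 5.]] -- the 3-4-5 right triangle
```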

15
Q

In Ridge regression, if α (alpha) is smaller, the penalty becomes larger.
Group of answer choices

True

False

A

FALSE
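
A minimal sketch (scikit-learn, synthetic data) of the actual direction: a larger alpha means a stronger penalty, which shrinks the coefficients:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=50, n_features=5, noise=10, random_state=0)
for alpha in (0.01, 100.0):
    model = Ridge(alpha=alpha).fit(X, y)
    # The coefficient norm gets smaller as alpha grows.
    print(alpha, np.linalg.norm(model.coef_))
```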

16
Q

Linear Regression is also known as Ordinal Least Squares.
Group of answer choices

True

False

A

FALSE

17
Q

The ‘slope’ parameter is also called the _______ or coefficient.
Group of answer choices

Weight

Length

Mean

Median

A

Weight

18
Q

Linear models make a prediction using a linear function of the input features.
Group of answer choices

True

False

A

TRUE
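
A minimal sketch, assuming scikit-learn and a made-up toy set, reproducing predict() by hand with the learned slopes (the weights in coef_) and the intercept:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0]])
y = np.array([2.0, 4.0, 6.0])  # y = 2x, so the learned weight is ~2

lr = LinearRegression().fit(X, y)
x_new = np.array([[4.0]])
# A linear model's prediction is the linear function X @ coef_ + intercept_.
manual = x_new @ lr.coef_ + lr.intercept_
print(manual, lr.predict(x_new))  # both ~ [8.0]
```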

19
Q

Lasso uses L1 Regularization.
Group of answer choices

True

False

A

TRUE
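
A minimal sketch (scikit-learn, synthetic data) of what the L1 penalty does in practice: it drives many coefficients exactly to zero, leaving a sparse model:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=100, n_features=20, n_informative=3,
                       noise=5, random_state=0)
lasso = Lasso(alpha=1.0).fit(X, y)
# Most coefficients end up exactly 0; only the informative features survive.
print(np.sum(lasso.coef_ != 0), "nonzero of", lasso.coef_.size)
```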

20
Q

Ridge regression is a linear regression model that controls complexity to avoid overfitting.
Group of answer choices

True

False

A

True

21
Q

Its primary objective is to map the input variable to the output variable.
Group of answer choices

Correlation

Classification

Unsupervised Learning

Supervised Learning

A

Supervised Learning

22
Q

Dichotomous classes mean Yes or No.
Group of answer choices

True

False

A

TRUE

23
Q

A model that performs poorly on both training and new data because it hasn’t learned enough from the training data.
Group of answer choices

Underfitting

Generalization

Overfitting

A

Underfitting

24
Q

Classification algorithms address classification problems where the output variable is categorical.

A

TRUE

25
Q

When comparing training set and test set scores, we find that we predict very accurately on the training set, but the R² on the test set is much worse. This is a sign of underfitting.
Group of answer choices

True

False

A

FALSE
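
A minimal sketch (scikit-learn, synthetic data) of the pattern the question describes, using an unrestricted decision tree as a stand-in for any overfit model:

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)
print("train R^2:", tree.score(X_tr, y_tr))  # 1.0 -- memorized the training set
print("test  R^2:", tree.score(X_te, y_te))  # much worse: overfitting, not underfitting
```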

26
Q

The ________ is the mean of the squared differences between the predictions and the true values.
Group of answer choices

Median error

Mean error

Total R

Mean Squared Error

Not in the options

A

Mean Squared Error
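
A minimal sketch, assuming scikit-learn, computing MSE both by hand and with mean_squared_error (the toy values are made up):

```python
import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.5, 5.0, 8.0])

# Mean of the squared differences: (0.25 + 0 + 1) / 3.
manual = np.mean((y_pred - y_true) ** 2)
print(manual, mean_squared_error(y_true, y_pred))  # both ~0.4167
```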

27
Q

Ridge is generally preferred over Lasso, but if you want a model that is easy to analyze and understand, then use Lasso.
Group of answer choices

True

False

A

TRUE