Exam Flashcards

1
Q

What is a classification problem?

A

A problem that requires machine learning algorithms that learn how to assign a class label to examples from the problem domain

2
Q

What is a regression problem?

A

A problem where the model learns to predict a continuous output variable

3
Q

What algorithms are used for regression problems?

A
  • Linear Regression
  • Support Vector Regression
  • Regression Tree
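
As a minimal sketch (assuming scikit-learn as the library, which the card does not name), each of the three algorithms above can be fitted to a small synthetic dataset like this:

    # Fit each regression algorithm named on the card to noisy 1-D data.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.svm import SVR
    from sklearn.tree import DecisionTreeRegressor

    X = np.linspace(0, 10, 50).reshape(-1, 1)          # one input feature
    y = 2.0 * X.ravel() + np.random.normal(0, 1, 50)   # noisy continuous target

    for model in (LinearRegression(), SVR(), DecisionTreeRegressor()):
        model.fit(X, y)
        print(type(model).__name__, model.predict([[5.0]]))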
4
Q

Give an example of a Classification problem

A

Getting a machine to classify different images, for example distinguishing between an apple [1,0,0], a banana [0,1,0] and a cherry [0,0,1]
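
The bracketed vectors are one-hot class labels; a minimal Python sketch (numpy assumed, names illustrative):

    # Each class gets a vector with a 1 in its own position and 0s elsewhere.
    import numpy as np
    labels = {"apple":  np.array([1, 0, 0]),
              "banana": np.array([0, 1, 0]),
              "cherry": np.array([0, 0, 1])}
    print(labels["banana"])   # -> [0 1 0]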

5
Q

What is Underfitting?

A

When a model cannot capture the underlying trend of the data

6
Q

Why does Underfitting occur?

A

The chosen algorithm/model does not fit the data well, or there is not enough training data

7
Q

What happens to the bias and variance in underfitting?

A

High bias and low variance

8
Q

What is Bias?

A

The assumptions made by a model to make the target function easier to learn

9
Q

What is Variance?

A

How much the model changes with the training data: the model obtains a low error on one training set but a high error when the training data is changed
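
The two cards above correspond to the standard bias-variance decomposition of expected test error (not stated on the cards; added here for reference, in LaTeX):

    \mathbb{E}\big[(y - \hat{f}(x))^2\big]
        = \mathrm{Bias}\big[\hat{f}(x)\big]^2
        + \mathrm{Var}\big[\hat{f}(x)\big]
        + \sigma^2_{\text{irreducible noise}}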

10
Q

How to prevent underfitting?

A

Increase model complexity
Increase the number of features (feature engineering)
Remove noise
Increase the number of training epochs
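
A minimal sketch of the first two remedies (scikit-learn assumed; the degree-2 feature expansion is an illustrative choice): adding polynomial features gives a plain linear model enough complexity to capture a curved trend.

    # An underfit linear model vs. the same model with engineered features.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    X = np.linspace(-3, 3, 100).reshape(-1, 1)
    y = X.ravel() ** 2 + np.random.normal(0, 0.5, 100)     # quadratic trend

    underfit = LinearRegression().fit(X, y)                # too simple
    richer = make_pipeline(PolynomialFeatures(degree=2),
                           LinearRegression()).fit(X, y)   # more features
    print(underfit.score(X, y), richer.score(X, y))        # R^2 on training data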

11
Q

What is overfitting?

A

When a model is trained too much or given too much freedom, it starts to learn from the noise and inaccurate data entries in the training set, building an unrealistic model that does not generalise

12
Q

What is overfitting in terms of variance and bias?

A

High variance and low bias

13
Q

How to reduce overfitting?

A

Increase training data
Reduce model complexity
Early stopping
L1 & L2 regularisation
Dropout, if a neural network
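
A minimal sketch of two of these remedies, L2 regularisation and early stopping, using a gradient-descent regressor (scikit-learn assumed; parameter values are illustrative):

    # penalty="l2" shrinks the weights; early_stopping halts training when the
    # score on a held-out validation split stops improving.
    import numpy as np
    from sklearn.linear_model import SGDRegressor

    X = np.random.rand(200, 5)
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + np.random.normal(0, 0.1, 200)

    model = SGDRegressor(penalty="l2", alpha=0.01,
                         early_stopping=True, validation_fraction=0.2,
                         max_iter=1000)
    model.fit(X, y)
    print(model.coef_)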

14
Q

What is regularisation?

A

The technique of calibrating machine learning models to minimise the loss and prevent overfitting or underfitting
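
In symbols (a standard formulation, not given on the card): regularisation adds a penalty term \Omega(\theta), weighted by \lambda, to the training loss:

    \min_{\theta} \; \mathcal{L}(\theta) + \lambda \, \Omega(\theta)

Ridge and Lasso regularisation (see the later cards) differ only in the choice of \Omega.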

15
Q

What does noise mean?

A

The data points in a dataset that don’t really represent the true properties of your data

16
Q

What does Bias mean in terms of regularisation?

A

The difference between the actual and predicted values. Giving less consideration to the data pattern leads to oversimplified, underfit models

17
Q

What does Variance mean in terms of regularisation?

A

A measure of the flexibility of the model; it decides how sensitive the model is to changes in the patterns of the input data

18
Q

What happens to the training and testing error when the bias is high?

A

They will also be high

19
Q

What happens to the training and testing error when the variance is high?

A

The training error will be low, but the testing error will be high

20
Q

Name the two main types of regularization techniques

A

Ridge and Lasso regularisation

21
Q

What is Ridge regularisation?

A

Modifies overfitted or underfitted models by adding a penalty equal to the sum of the squares of the magnitudes of the coefficients
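
As a LaTeX sketch of the standard Ridge objective (notation assumed: \beta_j are the coefficients, \lambda the penalty weight):

    \mathcal{L}_{\text{ridge}} = \sum_{i=1}^{n} \big(y_i - \hat{y}_i\big)^2 + \lambda \sum_{j=1}^{p} \beta_j^2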

22
Q

What is Lasso regression?

A

Modifies overfitted or underfitted models by adding a penalty equal to the sum of the absolute values of the coefficients
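
And the corresponding Lasso objective (same assumed notation), where the squared penalty is replaced by absolute values:

    \mathcal{L}_{\text{lasso}} = \sum_{i=1}^{n} \big(y_i - \hat{y}_i\big)^2 + \lambda \sum_{j=1}^{p} |\beta_j|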

23
Q

What is dropout in regularisation?

A

Randomly selected neurons are ignored (dropped out) during training, so their contribution is temporarily removed
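
A minimal numpy sketch of (inverted) dropout during training; the 0.5 rate and the function name are illustrative assumptions:

    import numpy as np

    def dropout(activations, rate=0.5, training=True):
        if not training:
            return activations                              # nothing dropped at test time
        keep = np.random.rand(*activations.shape) >= rate   # randomly select neurons to keep
        return activations * keep / (1.0 - rate)            # rescale so the expected value is unchanged

    print(dropout(np.array([0.2, 1.5, -0.7, 0.9])))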

24
Q

What happens as a neural network learns?

A

Weights settle into their context within the network and are tuned for specific features, providing some specialization. Neighboring neurons come to rely on this specialization, which can result in a fragile model that is too specialized to the training data.

25
Q

How does dropout help with overfitting?

A
  1. Neurons cannot rely on any one input, as it may drop out at random; this reduces the bias that comes from over-relying on one input
  2. Neurons will not learn redundant details of the inputs
26
Q

Concept attainment requires the following 5 categories

A
  1. Identify task
  2. Nature of examples used
  3. Validation procedure
  4. Consequences of categorizations
  5. Nature of imposed restriction
27
Q

What is a decision tree?

A

A supervised learning algorithm (for both regression and classification) with a flowchart-like tree structure of a root, nodes and branches
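
A minimal sketch (scikit-learn and its bundled iris dataset assumed) that fits a classification tree and prints its flowchart-like rules:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)   # depth capped to limit overfitting
    print(export_text(tree))                               # root, decision nodes and leaves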

28
Q

Advantages of decision trees

A
  • Easy to interpret
  • No data preparation required
  • More flexible
29
Q

Disadvantages of decision trees

A

  • Prone to overfitting
  • High variance
  • More costly
