Quiz 3 Flashcards

1
Q

Decision trees are an algorithm for which machine learning task?

A

Classification

2
Q

Root node

A

the topmost node of a decision tree; it has no incoming edges and zero or more outgoing edges

3
Q

Internal node

A

has exactly one incoming edge and two or more outgoing edges

4
Q

Terminal node

A

has exactly one incoming edge and no outgoing edge

5
Q

Internal node

A

denotes a test on a feature

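A minimal sketch of a decision-tree node structure matching these definitions; the TreeNode class and its field names are hypothetical:

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TreeNode:
    feature: Optional[str] = None    # feature tested at the root or an internal node
    label: Optional[str] = None      # class label stored at a terminal (leaf) node
    children: List["TreeNode"] = field(default_factory=list)  # outgoing edges

    def is_terminal(self) -> bool:
        # A terminal node has no outgoing edges.
        return not self.children
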
6
Q

In classification, a model or classifier

A

is constructed to predict class (categorical) labels

7
Q

Homogeneous class distribution

A

a node with a homogeneous class distribution (all records belonging to one class) is preferred when determining the best split for a decision tree

8
Q

Methods to measure node impurity

A

Gini Index
Information Gain
Gain Ratio
Misclassification Error

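A minimal sketch, in Python, of how per-node impurity can be computed from the node's class labels; the function names are hypothetical (Information Gain and Gain Ratio are derived from entropy when comparing candidate splits):

from collections import Counter
from math import log2

def class_probabilities(labels):
    # Fraction of the node's records that belong to each class.
    counts = Counter(labels)
    return [c / len(labels) for c in counts.values()]

def gini(labels):
    # Gini Index: 1 - sum(p_i^2); 0 for a pure node.
    return 1.0 - sum(p * p for p in class_probabilities(labels))

def entropy(labels):
    # Entropy: -sum(p_i * log2(p_i)); 0 for a pure node.
    return -sum(p * log2(p) for p in class_probabilities(labels) if p > 0)

def misclassification_error(labels):
    # Misclassification Error: 1 - max(p_i); 0 for a pure node.
    return 1.0 - max(class_probabilities(labels))
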
9
Q

What is the minimum and maximum value for entropy?

A

0 and 1 (the maximum of 1 applies to a two-class problem; in general the maximum is log2 of the number of classes)

10
Q

What is the minimum and maximum value of GINI?

A

0 and 0.5 (the maximum of 0.5 applies to a two-class problem; in general the maximum is 1 − 1/c for c classes)

11
Q

What is the minimum and maximum value for Misclassification Error?

A

0 and 0.5 (the maximum of 0.5 applies to a two-class problem; in general the maximum is 1 − 1/c for c classes)

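A quick worked check of the two-class maxima at a 50/50 split (p = 0.5 for each class), assuming the standard definitions:

Entropy: $-(0.5\log_2 0.5 + 0.5\log_2 0.5) = 0.5 + 0.5 = 1$
Gini: $1 - (0.5^2 + 0.5^2) = 1 - 0.5 = 0.5$
Misclassification Error: $1 - \max(0.5, 0.5) = 1 - 0.5 = 0.5$

Each measure reaches its minimum of 0 at a pure node, where a single class has probability 1.
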
12
Q

The process of building a decision tree is recursive. It is built one node at a time and the algorithm is applied at each node the same way.

A

True

13
Q

During the process of building a decision tree, at each node, the algorithm decides which attribute to use for a new split based on a splitting criterion evaluated for each candidate attribute.

A

True

14
Q

During the process of building a decision tree, at each node, we consider the similarities between all the points and take the most similar ones to create the next nodes, based on a criterion evaluated for each group of similar data.

A

False

15
Q

What splitting criteria are used in decision trees?

A

Gini Index
Information Gain (entropy-based)
Misclassification Error
Gain Ratio

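A minimal sketch of the recursive building procedure described above, using Gini-based impurity reduction to pick each split; the helper names and the stopping rule are simplified assumptions:

from collections import Counter, defaultdict

def gini(labels):
    # Gini impurity of a list of class labels.
    total = len(labels)
    return 1.0 - sum((c / total) ** 2 for c in Counter(labels).values())

def partition(records, labels, attribute):
    # Group records (dicts) and their labels by the value of one attribute.
    groups = defaultdict(lambda: ([], []))
    for record, label in zip(records, labels):
        recs, labs = groups[record[attribute]]
        recs.append(record)
        labs.append(label)
    return groups

def impurity_reduction(records, labels, attribute):
    # Parent impurity minus the weighted impurity of the child nodes.
    n = len(labels)
    children = partition(records, labels, attribute).values()
    weighted = sum(len(labs) / n * gini(labs) for _, labs in children)
    return gini(labels) - weighted

def build_tree(records, labels, attributes):
    # Terminal node: the node is pure or no attributes are left to test.
    if len(set(labels)) == 1 or not attributes:
        return {"label": Counter(labels).most_common(1)[0][0]}
    # Pick the attribute whose split most reduces impurity at this node.
    best = max(attributes, key=lambda a: impurity_reduction(records, labels, a))
    node = {"test": best, "children": {}}
    for value, (sub_recs, sub_labs) in partition(records, labels, best).items():
        remaining = [a for a in attributes if a != best]
        # The same algorithm is applied recursively to build each child.
        node["children"][value] = build_tree(sub_recs, sub_labs, remaining)
    return node
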
16
Q

Building a tree that is too large or complex can cause

A

overfitting

17
Q

To reduce the size of a tree after it is built, use an algorithm for

A

pruning

18
Q

A decision tree tells how to classify instances. It can be turned into a list of rules. Which of the following are true?

A

Each leaf node gets turned into one rule (one rule for each root-to-leaf path)

Each internal node after the root gets added to the rule as an AND condition

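For example, in a hypothetical tree whose root tests Outlook and whose next internal node tests Humidity, the path Outlook = Sunny, Humidity = High ending in a leaf labeled No becomes the rule: IF Outlook = Sunny AND Humidity = High THEN Class = No (the attribute names here are illustrative).
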
19
Q

Overfitting in machine learning occurs when a statistical model describes random error or noise instead of the underlying relationship, or when the model is excessively complex.

A

True