Chapter 11 Quiz Flashcards

1
Q

model that mimics the way human experts learn; built from interconnected nodes (neurons)

A

neural networks

2
Q

input layer, hidden layer, output layer

A

multilayer feedforward networks

3
Q

constant that controls the level of contribution of a node

A

bias

4
Q

monotone function applied to a node's weighted sum before the result is passed on

A

activation/transfer function

5
Q

what scale does the model perform best on?

A

0 to 1

6
Q

how to scale

A

(value - a) / (b - a), where [a, b] is the variable's range (a = min, b = max)

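The rescaling rule above can be sketched as follows (a minimal example; the interval [20, 80] is a hypothetical observed range):

```python
def minmax_scale(x, a, b):
    """Rescale x from [a, b] to [0, 1]: (x - a) / (b - a)."""
    return (x - a) / (b - a)

# Example: a variable observed on [20, 80]
print(minmax_scale(50, 20, 80))  # 0.5
```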
7
Q

using model errors to update weights, errors are computed from the last layer back to the hidden layers

A

back propagation

8
Q

weights are updated after each observation is run through the network (a trial)

A

case updating

9
Q

the entire training set is run through the network before each weight update takes place

A

batch updating

10
Q

multiplying factor for error correction during back propagation

A

learning rate

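A minimal sketch contrasting case updating with batch updating on a hypothetical toy problem (one linear neuron, squared-error loss, data following y = 2x); the learning rate lr scales each error correction:

```python
def case_update(w, data, lr=0.1):
    """Case updating: adjust w after each (x, y) observation (trial)."""
    for x, y in data:
        err = y - w * x          # prediction error for this trial
        w += lr * err * x        # correction scaled by the learning rate
    return w

def batch_update(w, data, lr=0.1):
    """Batch updating: run the whole training set, then adjust w once."""
    grad = sum((y - w * x) * x for x, y in data)
    return w + lr * grad / len(data)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x
w = 0.0
for _ in range(50):              # 50 epochs of case updating
    w = case_update(w, data)
print(round(w, 3))               # recovers the true slope, 2.0
```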
11
Q

down-weights new information to avoid overfitting

A

weight decay

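In many implementations, weight decay takes the form of a penalty that shrinks each weight toward zero during the update; a minimal sketch under that assumption (lr and decay values are hypothetical):

```python
def decayed_update(w, grad, lr=0.1, decay=0.01):
    """Gradient step with weight decay: shrink w toward zero each update."""
    return w - lr * (grad + decay * w)

print(decayed_update(1.0, 0.5))  # slightly smaller step than without decay
```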
12
Q

keeps weight updates moving in a consistent direction, speeding convergence of the weights to the optimum

A

momentum

13
Q

one complete run of the training data through the network

A

epoch

14
Q

complex networks with many layers, incorporating processes for dimension reduction and feature discovery

A

deep learning

15
Q

selects a subset of predictors and applies the same operation to the entire subset

A

convolutional neural networks

16
Q

main features of neural networks

A

black box; feedforward data flow (no cycles); fully connected layers; parametric; nonlinear

17
Q

popular output layer activation, converts to probabilities

A

softmax

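The activation described here (commonly softmax) can be sketched as follows; it converts a layer's raw outputs into probabilities summing to 1 (the input scores are hypothetical):

```python
import math

def softmax(scores):
    """Exponentiate and normalize so the outputs sum to 1."""
    m = max(scores)                      # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])      # probabilities, largest for the largest score
```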
18
Q

midrange scaling

A

midrange = (max + min) / 2
scaled value = (x - midrange) / (range / 2), where range = max - min
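A minimal sketch of midrange rescaling, which maps a variable onto [-1, 1] (the interval [20, 80] is a hypothetical observed range):

```python
def midrange_scale(x, lo, hi):
    """Map x from [lo, hi] onto [-1, 1] using the midrange."""
    midrange = (hi + lo) / 2
    half_range = (hi - lo) / 2
    return (x - midrange) / half_range

# Example: a variable observed on [20, 80]
print(midrange_scale(20, 20, 80), midrange_scale(50, 20, 80), midrange_scale(80, 20, 80))  # -1.0 0.0 1.0
```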

19
Q

what is the goal of weights and biases (parameters)?

A

minimizing prediction error

20
Q

when to stop training (three situations)?

A

predetermined number of epochs is reached

new weights computed are not too different from the previous values

target misclassification rate is reached

21
Q

what happens if there is more than one hidden layer?

A

adding layers increases the risk of overfitting and the computational cost

22
Q

how many hidden layers does deep learning have?

A

2 or more hidden layers

23
Q

how to determine size of hidden layer?

A

start with the number of predictors, then increase or decrease as needed

24
Q

model that can approximate any input/output relationship

A

universal approximator

25
Q

what are the circles and lines?

A

circles are neurons
lines are parameters

26
Q

how is complexity optimized?

A

controlling magnitudes of weights