Lecture 5 Flashcards

1
Q

Feature extraction

A

starts from an initial set of measured data and builds derived values (features) intended to be informative and non-redundant, facilitating the subsequent learning and generalization steps, and in some cases leading to better human interpretations.
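
For example, a minimal sketch in NumPy (illustrative only; the raw signal and the particular summary features are made-up assumptions, not from the lecture):

import numpy as np

# Hypothetical raw measurements: a short 1-D sensor signal (made-up data).
signal = np.array([0.1, 0.4, 0.35, 0.9, 0.8, 0.2, 0.05, 0.6])

# Derived values (features) intended to be informative and non-redundant.
features = {
    "mean": signal.mean(),
    "std": signal.std(),
    "min": signal.min(),
    "max": signal.max(),
}
print(features)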

2
Q

Neural Network approach to construct a non-linear classifier

A

Uses a large number of simpler activation functions

The functions themselves are fixed (e.g., Gaussian, sigmoid, or polynomial basis functions)

and optimization involves learning linear combinations of these fixed functions (see the sketch below)
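
A minimal sketch of this idea, assuming Gaussian radial basis functions on toy 1-D data (the data, centers, and width are illustrative choices): the basis functions stay fixed and only the weights of their linear combination are fit.

import numpy as np

# Toy 1-D regression data (made up for illustration).
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# Fixed Gaussian basis functions: the centers and width are chosen, not learned.
centers = np.linspace(-3, 3, 10)
width = 0.5
Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

# Optimization only fits the linear combination of these fixed functions.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w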

3
Q

Artificial Neural Networks

A

Take inspiration from the brain

define functions that are computed by neurons (units)

and have both input and output units, with hidden layers in between

4
Q

What’s the relationship between the number of hidden layers and the network’s capacity?

A

the capacity of the network increases with more hidden units and hidden layers

5
Q

What are the components of neural networks?

A

An input layer x (the independent variable)

an arbitrary number of hidden layers

an output layer ŷ (the dependent variable)

a set of weights (coefficients) and biases at each layer

and a choice of activation functions for each layer (see the sketch below)
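
A minimal NumPy sketch of these components for a single hidden layer (the sizes, random initialization, and sigmoid activation are assumptions for illustration):

import numpy as np

rng = np.random.default_rng(0)

# Input layer x (independent variable): here, 4 features for a single example.
x = rng.standard_normal(4)

# A set of weights (coefficients) and biases at each layer.
W1, b1 = rng.standard_normal((8, 4)), np.zeros(8)   # hidden layer with 8 units
W2, b2 = rng.standard_normal((1, 8)), np.zeros(1)   # output layer

# A choice of activation function for the hidden layer (sigmoid here).
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Output layer y_hat (dependent variable).
h = sigmoid(W1 @ x + b1)
y_hat = W2 @ h + b2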

6
Q

Activation functions

A

Are applied to the hidden units and introduce non-linearity into the network

7
Q

What are some popular activation functions?

A

Sigmoid, Tanh, and ReLU (Rectified Linear Unit)
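
These can be written down directly; a standard NumPy formulation (not tied to the lecture slides):

import numpy as np

def sigmoid(z):
    # Squashes inputs to the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes inputs to the range (-1, 1).
    return np.tanh(z)

def relu(z):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, z)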

8
Q

What are the components of neural network training?

A

A forward pass and a backward pass

9
Q

forward pass

A

performs inference; propagates the input through the network, layer by layer, using the current weights and biases to compute the predicted output ŷ
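
A minimal sketch of a forward pass for a one-hidden-layer network (the sigmoid activation and the W1, b1, W2, b2 names are illustrative assumptions):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    # Propagate the input through the network using the current weights and biases.
    h = sigmoid(W1 @ x + b1)   # hidden-layer activations
    y_hat = W2 @ h + b2        # predicted output
    return h, y_hat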

10
Q

backward pass

A

performs learning;

is a routine to compute the gradient; it applies the chain rule to the derivative of the loss function with respect to the weights and biases;

the weights and biases can then be changed to reduce the error (see the sketch below)
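
A minimal sketch of the matching backward pass for the one-hidden-layer forward pass above, assuming a squared-error loss L = 0.5 * (y_hat - y)^2 and a sigmoid hidden layer (an illustrative derivation, not the lecture's notation):

import numpy as np

def backward(x, y, h, y_hat, W2):
    # Chain rule applied to L = 0.5 * (y_hat - y)^2.
    dL_dyhat = y_hat - y              # dL/dy_hat
    dW2 = np.outer(dL_dyhat, h)       # dL/dW2
    db2 = dL_dyhat                    # dL/db2
    dh = W2.T @ dL_dyhat              # dL/dh
    dz1 = dh * h * (1.0 - h)          # dL/dz1, using the sigmoid derivative h * (1 - h)
    dW1 = np.outer(dz1, x)            # dL/dW1
    db1 = dz1                         # dL/db1
    return dW1, db1, dW2, db2

# The weights and biases can then be changed to reduce the error, e.g.
# W1 -= learning_rate * dW1 (and similarly for b1, W2, b2).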

11
Q

Back propagation

A

an efficient method for computing gradients needed to perform gradient-based optimization of the weights in a multi-layer network
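
In symbols, for a one-hidden-layer network with h = σ(W⁽¹⁾x + b⁽¹⁾) and ŷ = W⁽²⁾h + b⁽²⁾ (notation assumed for illustration), the chain rule factors as

\frac{\partial L}{\partial W^{(2)}} = \frac{\partial L}{\partial \hat{y}}\,\frac{\partial \hat{y}}{\partial W^{(2)}},
\qquad
\frac{\partial L}{\partial W^{(1)}} = \frac{\partial L}{\partial \hat{y}}\,\frac{\partial \hat{y}}{\partial h}\,\frac{\partial h}{\partial W^{(1)}},

and the shared factors (here ∂L/∂ŷ; in deeper networks, the gradient flowing into each layer) are computed once and reused rather than recomputed for every weight, which is what makes the method efficient.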

12
Q

Deep neural network

A

a multilayer perceptron or a neural network with multiple hidden layers

13
Q

Which of the following introduces non-linearity to a neural network?

Rectified Linear Unit (ReLU) function, Convolution function, or Stochastic Gradient Descent

A

Rectified Linear Unit (ReLU); it is a non-linear activation function.
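
A quick numerical check (illustrative, not from the lecture): a linear function f would satisfy f(a) + f(b) == f(a + b), and ReLU does not.

import numpy as np

relu = lambda z: np.maximum(0.0, z)

print(relu(-1.0) + relu(1.0))   # 1.0
print(relu(-1.0 + 1.0))         # 0.0  -> ReLU is not linear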
