Neural Networks Perceptron Flashcards

Exam2

1
Q

What activation function does Perceptron use?

A

Step Function
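
A minimal sketch of this step activation in code; the function names and default threshold are illustrative assumptions, not part of the card:

```python
import numpy as np

def step(z, threshold=0.0):
    """Perceptron step activation: output 1 if the weighted input
    reaches the threshold, otherwise 0."""
    return np.where(z >= threshold, 1, 0)

def perceptron(x, w, b):
    # Output = step function applied to the weighted sum of the inputs plus a bias.
    return step(np.dot(w, x) + b)
```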

2
Q

Can a single perceptron find a non-linear decision boundary?

A

No. A single perceptron applies a step function to a weighted sum of its inputs, so every decision boundary it can represent is linear (a straight line / hyperplane); it can only separate linearly separable data.
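
A short illustration of why the boundary is always linear: the unit fires exactly when w·x + b ≥ 0, a linear inequality. The weights below are hand-picked, hypothetical values that happen to implement AND, one of the linearly separable functions a single perceptron can handle:

```python
import numpy as np

w = np.array([1.0, 1.0])   # hypothetical weights
b = -1.5                   # hypothetical bias

def activates(x):
    # The firing condition w . x + b >= 0 is a linear inequality, so the
    # boundary w . x + b == 0 is always a straight line (hyperplane).
    return float(np.dot(w, x) + b) >= 0

# These particular weights implement AND, which is linearly separable:
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, int(activates(np.array(x))))
```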

3
Q

What is the XOR problem for the perceptron, and how was it ultimately solved?

A

The XOR problem has a nonlinear decision boundary that a single-layer perceptron is not capable of learning. It was ultimately solved by using a multi-layer perceptron (a neural network) with at least one hidden layer, allowing the network to learn complex nonlinear relationships and accurately model the XOR function.
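
A hedged sketch of how one hidden layer fixes this: the hand-picked weights below (my illustration, not from the card) make the two hidden units compute OR and NAND, and the output unit ANDs them together, which is exactly XOR:

```python
import numpy as np

step = lambda z: (z >= 0).astype(int)

# Hidden layer: unit 1 computes OR, unit 2 computes NAND (hand-chosen weights).
W_hidden = np.array([[ 1.0,  1.0],    # OR:    x1 + x2 - 0.5 >= 0
                     [-1.0, -1.0]])   # NAND: -x1 - x2 + 1.5 >= 0
b_hidden = np.array([-0.5, 1.5])

# Output layer: AND of the two hidden units.
w_out = np.array([1.0, 1.0])
b_out = -1.5

def mlp_xor(x):
    h = step(W_hidden @ x + b_hidden)
    return int(step(w_out @ h + b_out))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, mlp_xor(np.array(x)))   # 0, 1, 1, 0
```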

4
Q

Is a multi-layer perceptron a neural network?

A

Yes

5
Q

What is the input to an activation function at the first layer of a neural network?

A

The weighted sum of the input features (e.g., price, shipping cost, marketing spend).
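
A small sketch of what reaches a first-layer activation function, reusing the card's example features; the feature values, weights, and bias below are made-up assumptions:

```python
import numpy as np

# Illustrative input features for one example: price, shipping cost, marketing spend.
x = np.array([19.99, 4.50, 120.0])

# Illustrative weights and bias for one first-layer unit.
w = np.array([0.3, -0.8, 0.01])
b = 0.5

# This weighted sum (plus bias) is the value fed into the unit's activation function.
z = np.dot(w, x) + b
print(z)
```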

6
Q

How is a neuron/unit activated in a neural network? Will all units be activated?

A

Each neuron computes a weighted sum of its inputs and is activated only if that sum exceeds a certain threshold. Not all units will necessarily be activated: a unit stays off whenever its own weighted input falls below the threshold.
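
A brief sketch of one layer showing that only the units whose weighted sum crosses the threshold actually activate (all numbers are illustrative assumptions):

```python
import numpy as np

x = np.array([1.0, 2.0])            # illustrative inputs

W = np.array([[ 0.5,  0.5],         # unit 1: weighted sum =  1.5 -> activates
              [-1.0, -1.0],         # unit 2: weighted sum = -3.0 -> stays off
              [ 2.0, -0.5]])        # unit 3: weighted sum =  1.0 -> activates
b = np.zeros(3)
threshold = 0.0

z = W @ x + b                       # one weighted sum per unit
activated = z > threshold           # only some units cross the threshold
print(z, activated)                 # [ 1.5 -3.   1. ] [ True False  True]
```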

7
Q

Why are neural networks referred to as a black box?

A

Because of the complexity of the models, it’s hard to trace exactly which features each neuron is working with or what relationship it has found; we don’t understand how the network actually arrives at its predictions.

8
Q

On what types of problems do neural networks outperform other machine learning models like linear regression, decision trees, and SVMs?

A

Natural Language Processing, Image Recognition, Speech Recognition

9
Q

What role does back propagation play in neural networks?

A

Backpropagation is the training algorithm used to find the optimal coefficients (weights) of a neural network.

10
Q

Describe back propagation

A

1. Uses the derivative of the cost function to determine how much each coefficient at the output layer contributed to the error.
2. Measures how much of that error came from each neuron in the layer below, working backward until the gradient has been computed for every neuron in every layer.
3. Performs a gradient descent step, tweaking each of the coefficients.
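
A minimal, hedged sketch of those three steps on a tiny one-hidden-layer network; the architecture, learning rate, and data are my own illustrative choices, and sigmoid is used instead of the step function because backpropagation needs a differentiable activation:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
y = np.array([[0.], [1.], [1.], [0.]])                    # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)             # input  -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)             # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # 1) Derivative of the cost at the output layer: how much each output
    #    coefficient contributed to the error (squared-error cost).
    d_out = (out - y) * out * (1 - out)

    # 2) Propagate the error back to the hidden layer to get the gradient
    #    for every neuron and layer.
    d_h = (d_out @ W2.T) * h * (1 - h)

    # 3) Gradient descent step: tweak each coefficient.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]
```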
