SECTION 2: Multi-layer perceptrons (MLP) Flashcards

1
Q

Describe a simple multi-layer perceptron. (2p)

A

An MLP has an input layer, an output layer and at least one hidden layer (that is the simplest version of an MLP). It uses the sigmoid activation function and is trained with backpropagation.
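
A minimal NumPy sketch of this structure (the layer sizes, learning rate and squared-error loss are illustrative assumptions, not part of the card):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Illustrative sizes: 2 inputs -> 2 hidden nodes -> 1 output
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(2, 2)), np.zeros(2)   # input -> hidden weights
    W2, b2 = rng.normal(size=(2, 1)), np.zeros(1)   # hidden -> output weights

    def forward(x):
        h = sigmoid(x @ W1 + b1)   # hidden layer activation
        y = sigmoid(h @ W2 + b2)   # output layer activation
        return h, y

    def train_step(x, target, lr=0.5):
        """One backpropagation step on a single example (squared-error loss)."""
        global W1, b1, W2, b2
        h, y = forward(x)
        delta_out = (y - target) * y * (1 - y)        # output-layer error signal
        delta_hid = (delta_out @ W2.T) * h * (1 - h)  # error pushed back through W2
        W2 -= lr * np.outer(h, delta_out); b2 -= lr * delta_out
        W1 -= lr * np.outer(x, delta_hid); b1 -= lr * delta_hid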

2
Q

What does its structure allow it to compute that a single-layer perceptron can’t? (2p)

A

Linearly non-separable problems. That is, problems where the classes cannot be separated by a single linear decision boundary.
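
A minimal sketch of the contrast, assuming the classic single-layer perceptron learning rule (the function name, learning rate and epoch count are illustrative): the rule converges on AND, which is linearly separable, but can never get all four XOR cases right.

    import numpy as np

    def train_perceptron(X, t, epochs=100, lr=0.1):
        """Single-layer perceptron learning rule with a bias weight."""
        w = np.zeros(X.shape[1] + 1)
        Xb = np.hstack([X, np.ones((len(X), 1))])    # append a constant bias input
        for _ in range(epochs):
            for xi, ti in zip(Xb, t):
                yi = 1 if xi @ w > 0 else 0
                w += lr * (ti - yi) * xi             # update only on mistakes
        return (Xb @ w > 0).astype(int)              # final predictions

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    print(train_perceptron(X, np.array([0, 0, 0, 1])))  # AND: converges to [0 0 0 1]
    print(train_perceptron(X, np.array([0, 1, 1, 0])))  # XOR: never matches [0 1 1 0]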

3
Q

Which of the following, given appropriate connection weights, can a multi-layer perceptron linearly classify? (may be more than one correct answer) (2p)

A. AND-gated inputs
B. OR-gated inputs
C. XOR-gated inputs
D. Not XOR-gated inputs

A

All of them. (Though the linearly non-separable cases, XOR and NOT-XOR, first have to be transformed into a linearly separable representation by the hidden layer.)

4
Q

Explain how a Multi-Layer Perceptron can transform a linearly non-separable problem into a linearly separable one. (3p)

A

An MLP can use its hidden layer to transform the input from a linearly non-separable representation into a linearly separable one, which the output layer can then classify with a single decision boundary.

One example is the XOR gate/problem, where the hidden nodes "take one each" of the logical functions OR and AND; the output node then fires when OR is active but AND is not, which is exactly XOR.
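
A minimal sketch of that decomposition with hand-picked threshold weights (these particular values are just one illustrative choice):

    import numpy as np

    def step(x):
        return (x > 0).astype(int)                 # simple threshold unit

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

    # Hidden layer: one node computes OR, the other AND
    h_or  = step(X @ np.array([1, 1]) - 0.5)       # fires if at least one input is 1
    h_and = step(X @ np.array([1, 1]) - 1.5)       # fires only if both inputs are 1
    H = np.column_stack([h_or, h_and])

    # In the (OR, AND) hidden space the problem is linearly separable:
    # the output node fires when OR is on but AND is off, i.e. XOR
    y = step(H @ np.array([1, -1]) - 0.5)
    print(y)                                       # [0 1 1 0]

In the (OR, AND) hidden-layer space the four inputs collapse onto three points, and a single line separates the XOR-positive point from the other two.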

5
Q

Explain the advantages and disadvantages of having many hidden nodes in a hidden layer. (3p)

A

Up to a certain point, more hidden nodes give you faster learning, because they enlarge the solution space. Beyond that point they can also make learning slower, because the network has more parameters to take into account in its calculations.

ADVANTAGES
+ bigger solution space
+ can handle more complex problems

DISADVANTAGES
- takes more time to train
- credit assignment becomes harder (it is more difficult to attribute the error to individual weights)