Artificial Neural Networks Flashcards

Notes on ANNs that may help with the exam.

1
Q

What are the three types of layers in a Neural Network, and what do they aim to do?

A

Input Layer - Connects to the input data (typically features)

Hidden Layer - Performs feature interactions (weighted sums passed through activations)

Output Layer - Produces the final prediction (decision making)

2
Q

How does an MLP network function, i.e. what are its components?

A

Many neurons are stacked in multiple layers to learn a complex relationship.

A set of training examples is needed to train the network

Weights that connect neurons are updated during training to improve the prediction

After training, the network can be used to make predictions on unseen data quickly
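
The components above can be sketched as a tiny forward pass in NumPy; the layer sizes, weights, and input values here are illustrative, not from the cards:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 input features, 4 hidden neurons, 1 output.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def relu(z):
    return np.maximum(0.0, z)

def forward(x):
    """One forward pass: weighted sums plus activations, layer by layer."""
    h = relu(W1 @ x + b1)   # hidden layer: feature interactions
    return W2 @ h + b2      # output layer: final prediction

y = forward(np.array([1.0, -2.0, 0.5]))
```

Training then amounts to adjusting W1, b1, W2, b2 so that `forward` matches the training labels.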

3
Q

What does each neuron contain?

A

Each neuron contains an activation function, which it applies to the weighted sum of its inputs plus a bias.

4
Q

What four activation functions are typically used in MLPs?

A

Sigmoid
Tanh
ReLU
Linear
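
All four can be written in a few lines of NumPy (a minimal sketch; function names are my own):

```python
import numpy as np

def sigmoid(x):  # squashes values into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):     # squashes values into (-1, 1)
    return np.tanh(x)

def relu(x):     # zeroes out negative values
    return np.maximum(0.0, x)

def linear(x):   # identity; typically used in regression output layers
    return x
```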

5
Q

What is the activation function’s equation for Sigmoid?

A

σ(x) = 1 / (1 + e^(-x))

6
Q

What problem is a Sigmoid Function used for in MLPs?

A

Binary Classification

7
Q

What is the activation function equation for Tanh?

A

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))

8
Q

What problems does a tanh activation function solve in MLPs?

A

Classification problems - Normalises activations to the range (-1, 1), making it easier for later layers to learn

Regression problems - If the target variable has positive and negative values, tanh ensures the output can match the range

9
Q

What is the equation for the activation function ReLU in MLPs?

A

f(x) = max(0, x)

10
Q

What problems does the activation function ReLU solve in MLPs?

A

Non-linear problems - Used to model non-linear relationships

Deep neural networks - Faster and more efficient training than saturating activations such as sigmoid and tanh

11
Q

What is the activation function equation for Linear in MLPs?

A

f(x) = x

12
Q

What problem does the activation function Linear solve in MLPs?

A

Regression - Primarily used in the output layer when solving regression tasks

13
Q

Why are activation functions useful to have in MLPs?

A

They introduce non-linearity, allowing neural networks to learn complex patterns in data. Without them, the network collapses into a single linear model, no matter how many layers it has.
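
A quick NumPy check of this collapse (shapes and seed are arbitrary): two stacked layers with no activation are exactly equivalent to one linear layer.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(5, 3))  # first "layer": 3 inputs -> 5 units
W2 = rng.normal(size=(2, 5))  # second "layer": 5 units -> 2 outputs
x = rng.normal(size=3)

# Two linear layers applied in sequence, with no activation between them...
deep = W2 @ (W1 @ x)
# ...give the same result as a single linear layer with weights W2 @ W1.
shallow = (W2 @ W1) @ x

assert np.allclose(deep, shallow)
```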

14
Q

What is the step-by-step process for training an ANN?

A
  1. Randomly initialise the weights and biases (W & b)
  2. Input data, perform a forward pass, and generate a prediction
  3. Calculate the error between the predicted output and the true label
  4. Backpropagate the error from the output layer back to the input layer by calculating the averaged gradients of the batch for each W and b
  5. Update all W and b using the gradients (gradient descent)
  6. Repeat steps 2 - 5 until the output is good enough or W and b can't be further improved.
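
The steps above can be sketched as a minimal NumPy training loop. The XOR dataset, network size, learning rate, loss (MSE), and epoch count are all illustrative choices, not part of the card:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic non-linearly-separable problem.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Step 1: randomly initialise weights and biases.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate
for epoch in range(5000):
    # Step 2: forward pass -> prediction.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Step 3: error between prediction and true labels (MSE here).
    loss = np.mean((p - y) ** 2)
    # Step 4: backpropagate, averaging gradients over the batch.
    dp = 2 * (p - y) / len(X)
    dz2 = dp * p * (1 - p)            # sigmoid derivative
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1 - h**2)   # tanh derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Step 5: gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    # Step 6: the loop repeats steps 2-5 until the loss is small enough.
```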
15
Q

What are some hyperparameters of ANNs?

A

Learning rate
Activation function(s)
Number of layers, i.e. depth
Number of neurons in each layer
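
As a hedged illustration, these hyperparameters map onto scikit-learn's MLPClassifier constructor; the specific values below are arbitrary, not recommendations:

```python
from sklearn.neural_network import MLPClassifier

clf = MLPClassifier(
    hidden_layer_sizes=(32, 16),  # depth (2 hidden layers) and neurons per layer
    activation="relu",            # activation function for the hidden layers
    learning_rate_init=0.01,      # learning rate for gradient descent
    max_iter=500,
)
```

Tuning usually means searching over combinations of these values (e.g. with cross-validation) rather than picking them once.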
