Lecture 6 Flashcards

1
Q

What is the purpose of neural networks in machine learning?

A

To learn nonlinear decision boundaries by automatically extracting features.

2
Q

What are the two approaches to making linear models more powerful?

A

Expanding features manually and using neural networks.

3
Q

What is a perceptron?

A

A simple artificial neuron that makes a binary decision based on a weighted sum of its inputs.

4
Q

What are the components of a perceptron?

A

Inputs, weights, bias, and an activation function.

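The components above fit together in a few lines of plain Python; a minimal sketch, where the particular weights and bias (implementing a logical AND gate) are illustrative assumptions, not from the lecture:

```python
def step(z):
    # Step (sign) activation: fires (+1) when the weighted sum is non-negative.
    return 1 if z >= 0 else -1

def perceptron(x, w, b):
    # Weighted sum of inputs plus bias, passed through the step function.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return step(z)

# Illustrative weights/bias implementing logical AND on 0/1 inputs:
w, b = [1.0, 1.0], -1.5
```

The perceptron fires only when both inputs are 1, since 1 + 1 - 1.5 > 0 but 1 - 1.5 < 0.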
5
Q

What activation function is commonly used in perceptrons?

A

Step function (sign function).

6
Q

What is a limitation of perceptrons?

A

They can only learn linearly separable functions.

7
Q

What is the difference between a perceptron and a neuron in a neural network?

A

Neurons in a neural network use smooth, differentiable activation functions (e.g. sigmoid or ReLU) instead of a hard step, which makes them trainable by gradient descent and, stacked in layers, more expressive.

8
Q

What activation functions are commonly used in neural networks?

A

Sigmoid, ReLU, and Softmax.

9
Q

What is a feedforward neural network?

A

A type of neural network where information moves in one direction, from input to output.

10
Q

What is a multilayer perceptron (MLP)?

A

A feedforward neural network with at least one hidden layer.

11
Q

What is the purpose of a hidden layer in an MLP?

A

To transform inputs into new representations that can model complex functions.

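The last three cards (feedforward network, MLP, hidden layer) can be sketched as a single forward pass in plain Python; the layer shapes and weight values in the usage below are made up for illustration:

```python
def relu(z):
    # ReLU activation: zero for negatives, identity for positives.
    return max(0.0, z)

def dense(x, W, b, act):
    # One fully connected layer: act(row . x + bias) for each unit.
    return [act(sum(wij * xj for wij, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

def mlp(x, W1, b1, W2, b2):
    # Feedforward MLP: input -> hidden layer (ReLU) -> linear output.
    h = dense(x, W1, b1, relu)            # hidden representation
    return dense(h, W2, b2, lambda z: z)  # identity output activation
```

Information flows strictly input -> hidden -> output, with the hidden layer computing the new representation the next card describes.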
12
Q

Why do neural networks need nonlinear activation functions?

A

Without them, a multi-layer network would collapse into a linear model.

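The collapse is easy to verify numerically: without an activation between them, two linear layers compute W2(W1 x) = (W2 W1) x, a single linear map. A small sketch with made-up 2x2 weights:

```python
def matvec(W, x):
    # Matrix-vector product W x.
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

def matmul(A, B):
    # Matrix-matrix product A B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

W1 = [[1.0, 2.0], [0.0, 1.0]]   # illustrative first-layer weights
W2 = [[3.0, 0.0], [1.0, 1.0]]   # illustrative second-layer weights
x = [1.0, -1.0]

two_layers = matvec(W2, matvec(W1, x))  # layer by layer, no activation
one_layer = matvec(matmul(W2, W1), x)   # single collapsed linear map
```

Both paths give the same output, so stacking linear layers adds no expressive power.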
13
Q

What is the sigmoid activation function?

A

A function that maps any real number into the range (0,1).

14
Q

What is the ReLU activation function?

A

ReLU (Rectified Linear Unit) sets negative values to zero while keeping positive values unchanged.

15
Q

What is the benefit of using ReLU over sigmoid?

A

ReLU mitigates the vanishing gradient problem and accelerates training.

16
Q

What is softmax activation used for?

A

For multi-class classification: it converts a vector of raw scores into a probability distribution over the classes.
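The three activations from cards 13-16 fit in a short sketch; the max-shift inside softmax is a standard numerical-stability trick, not something the cards specify:

```python
import math

def sigmoid(z):
    # Squashes any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    # Sets negative values to zero, keeps positive values unchanged.
    return max(0.0, z)

def softmax(scores):
    # Converts a score vector into probabilities that sum to 1.
    m = max(scores)                       # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

Larger scores get larger probabilities, but every class keeps a nonzero share.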

17
Q

What is backpropagation?

A

An algorithm for training neural networks by adjusting weights based on error gradients.
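For a single sigmoid neuron with squared-error loss, the chain rule that backpropagation applies layer by layer fits in a few lines; this one-neuron sketch and its names are illustrative, not the lecture's derivation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, y):
    # Forward pass: one sigmoid neuron, squared error against target y.
    a = sigmoid(w * x + b)
    return (a - y) ** 2

def grad_w(w, b, x, y):
    # Backward pass (chain rule): dL/dw = dL/da * da/dz * dz/dw
    #                                   = 2(a - y) * a(1 - a) * x.
    a = sigmoid(w * x + b)
    return 2 * (a - y) * a * (1 - a) * x
```

The analytic gradient can be checked against a finite-difference estimate of the loss, which is exactly what "adjusting weights based on error gradients" relies on.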

18
Q

What is the loss function in neural networks?

A

A function that measures how far the predictions are from the true values.
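Two common choices, sketched in plain Python (mean squared error for regression, cross-entropy for classification; the function names are illustrative):

```python
import math

def mse(y_true, y_pred):
    # Mean squared error: average squared gap between prediction and target.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy(y_true, probs):
    # Cross-entropy with one-hot targets: penalizes low probability
    # assigned to the true class.
    return -sum(t * math.log(p) for t, p in zip(y_true, probs))
```

Both are zero-or-small when predictions match the targets and grow as they drift apart.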

19
Q

What is stochastic gradient descent (SGD)?

A

An optimization method that updates the weights after each individual example (or small minibatch) rather than after a full pass over the data, making learning more efficient.

20
Q

What is the role of learning rate in gradient descent?

A

It controls how much the weights are adjusted at each step.

21
Q

What happens if the learning rate is too high?

A

The updates can overshoot the minimum, so the loss oscillates or diverges instead of converging to a solution.

22
Q

What happens if the learning rate is too low?

A

The model may take too long to converge or get stuck in local minima.
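Both failure modes from cards 21 and 22 show up on the toy objective f(w) = w², whose gradient is 2w; this sketch and its step counts are illustrative:

```python
def descend(lr, steps=20, w=1.0):
    # Gradient descent on f(w) = w^2: each step does w -= lr * f'(w).
    for _ in range(steps):
        w -= lr * 2 * w
    return abs(w)   # distance from the minimum at w = 0
```

With a moderate rate the iterate shrinks toward 0; too high a rate makes it overshoot and blow up; too low a rate leaves it barely moved after the same number of steps.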

23
Q

What is the difference between batch gradient descent and stochastic gradient descent?

A

Batch GD uses the entire dataset for each update, while SGD updates weights using one example at a time.
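The two update schemes can be contrasted on a tiny 1-D least-squares problem, fitting y = w * x; the data and step size below are made up:

```python
# Three (x, y) pairs generated by y = 2x, so the best weight is w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def batch_step(w, lr):
    # Batch GD: one update using the mean gradient over the whole dataset.
    g = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * g

def sgd_epoch(w, lr):
    # SGD: a separate update after every single example (one epoch = one
    # complete pass through the dataset, as card 27 defines it).
    for x, y in data:
        w -= lr * 2 * (w * x - y) * x
    return w
```

Batch GD takes one carefully averaged step per pass; SGD takes several noisier steps in the same pass, which is why it often makes faster initial progress.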

24
Q

What is overfitting in neural networks?

A

When the model learns the training data too well, including noise, and performs poorly on unseen data.

25
Q

What is regularization in neural networks?

A

Techniques like dropout and weight decay to prevent overfitting.

26
Q

What is dropout?

A

A regularization technique that randomly disables some neurons during training to improve generalization.
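A sketch of dropout, assuming the common "inverted dropout" convention of rescaling kept units by 1/(1-p) so expected activations are unchanged; the names are illustrative:

```python
import random

def dropout(activations, p, training=True):
    # Inverted dropout: keep each unit with probability 1 - p and scale
    # the survivors by 1 / (1 - p); do nothing at test time.
    if not training:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]
```

Because some units vanish on every training pass, no single neuron can be relied on, which is what improves generalization.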

27
Q

What is an epoch in neural network training?

A

One complete pass through the entire training dataset.

28
Q

What is the universal approximation theorem?

A

A feedforward network with at least one hidden layer can approximate any continuous function on a compact domain to arbitrary accuracy, given enough hidden neurons.

29
Q

How do neural networks compare to SVMs?

A

SVMs use kernel tricks to expand features, while neural networks learn their own feature representations.