Lecture 8 - Deep Learning Basics Flashcards

1
Q

What is machine learning for visual perception?

A

Machine learning for visual perception involves discovering and leveraging patterns in images to make predictions or classifications. It includes tasks such as segmentation, detection, and recognition.

2
Q

Describe supervised learning.

A

Supervised learning involves training a model with labeled examples, where each example has both features (images) and corresponding labels related to the task, such as segmentation or recognition.

3
Q

What is unsupervised learning?

A

Unsupervised learning involves modeling, representing, or describing the variability within data without specific labels. It is used for tasks like noise removal, super-resolution, and deconvolution.

4
Q

Explain the concept of a cost function in machine learning.

A

A cost function is used to define “best” parameters for a model by measuring the error between the model’s predictions and the ground truth labels. Examples include cross-entropy for classification and mean squared error for regression.
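
For illustration only (not from the lecture), a minimal NumPy sketch of the two cost functions mentioned above; the function names and toy arrays are invented for the example:

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Regression cost: average squared difference between targets and predictions.
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_true, y_pred, eps=1e-12):
    # Classification cost: y_true is one-hot, y_pred holds predicted class probabilities.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

print(mean_squared_error(np.array([1.0, 2.0]), np.array([1.1, 1.8])))
print(cross_entropy(np.array([[1, 0], [0, 1]]), np.array([[0.9, 0.1], [0.2, 0.8]])))
```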

5
Q

What is a perceptron?

A

A perceptron is a simple linear classifier used for binary classification. It consists of a linear transformation of the inputs (the weighted sum, sometimes called the activation) followed by a nonlinearity (e.g., a sigmoid function) that produces the output.
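
A minimal sketch of a perceptron forward pass in NumPy (illustrative only; the weights, bias, and input values are made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, b):
    # Linear transformation of the inputs followed by a sigmoid nonlinearity.
    return sigmoid(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # input features
w = np.array([0.2, 0.4, -0.1])   # weights
b = 0.1                          # bias
print(perceptron(x, w, b))       # output in (0, 1); threshold at 0.5 for a binary decision
```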

6
Q

Describe the architecture of a multilayer perceptron (MLP).

A

An MLP consists of an input layer, one or more hidden layers, and an output layer. Each layer is composed of neurons that apply a linear transformation followed by a non-linear activation function, enabling the network to model complex functions.
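
For concreteness, a minimal forward pass through a two-layer MLP in NumPy (layer sizes and random weights are arbitrary choices for the sketch):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def mlp_forward(x, W1, b1, W2, b2):
    h = relu(W1 @ x + b1)   # hidden layer: linear transformation + non-linear activation
    return W2 @ h + b2      # output layer: linear scores (activation depends on the task)

rng = np.random.default_rng(0)
x = rng.normal(size=4)                           # input layer: 4 features
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)    # hidden layer: 8 neurons
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)    # output layer: 3 scores
print(mlp_forward(x, W1, b1, W2, b2))
```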

7
Q

What is the purpose of backpropagation in training neural networks?

A

Backpropagation is an algorithm used to train neural networks by computing the gradient of the cost function with respect to each weight and updating the weights to minimize the cost function.
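
A minimal illustration of the idea, using logistic regression (a single neuron) as the simplest case; backpropagation applies the same chain rule layer by layer in deeper networks. The values below are invented:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: prediction and cross-entropy cost for one labelled example.
x, y = np.array([1.0, 2.0]), 1.0
w, b = np.array([0.1, -0.2]), 0.0
p = sigmoid(w @ x + b)
cost = -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Backward pass: the chain rule gives dcost/dw = (p - y) * x and dcost/db = (p - y).
grad_w = (p - y) * x
grad_b = p - y

# Gradient descent step using the computed gradients.
lr = 0.1
w -= lr * grad_w
b -= lr * grad_b
print(cost, grad_w, grad_b)
```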

8
Q

Explain the role of the activation function in neural networks.

A

The activation function introduces non-linearity into the network, allowing it to learn and model complex relationships between inputs and outputs. Common activation functions include sigmoid, tanh, and ReLU.
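
For reference, the three activation functions named above as NumPy one-liners (a sketch, not lecture code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes any real value into (0, 1)

def tanh(z):
    return np.tanh(z)                 # squashes any real value into (-1, 1)

def relu(z):
    return np.maximum(0.0, z)         # zero for negative inputs, identity otherwise
```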

9
Q

What is a fully connected layer in a neural network?

A

A fully connected layer is a layer where each neuron is connected to every neuron in the previous layer, enabling complex interactions between features and contributing to the network’s ability to learn representations.

10
Q

Describe the softmax function used in neural networks.

A

The softmax function is used in the output layer of a neural network for multi-class classification. It converts raw output scores (logits) into probabilities by exponentiating the scores and normalizing them.
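
A minimal, numerically stable softmax in NumPy (the max-subtraction trick is a standard implementation detail, not necessarily covered in the lecture):

```python
import numpy as np

def softmax(logits):
    # Subtracting the maximum does not change the result but avoids overflow in exp().
    z = logits - np.max(logits)
    exp_z = np.exp(z)
    return exp_z / np.sum(exp_z)

print(softmax(np.array([2.0, 1.0, 0.1])))  # probabilities that sum to 1
```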

11
Q

What is the difference between a training set, validation set, and test set?

A

The training set is used to learn model parameters, the validation set is used to tune hyperparameters and prevent overfitting, and the test set is used to evaluate the model’s generalization performance on unseen data.
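
As a sketch only, one common way to produce the three splits (the 70/15/15 ratio and function name are arbitrary choices for the example, not from the lecture):

```python
import numpy as np

def split_dataset(X, y, train_frac=0.7, val_frac=0.15, seed=0):
    # Shuffle the indices, then carve out train / validation / test portions.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(train_frac * len(X))
    n_val = int(val_frac * len(X))
    train_idx = idx[:n_train]
    val_idx = idx[n_train:n_train + n_val]
    test_idx = idx[n_train + n_val:]
    return ((X[train_idx], y[train_idx]),
            (X[val_idx], y[val_idx]),
            (X[test_idx], y[test_idx]))
```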

12
Q

Explain the concept of overfitting in machine learning.

A

Overfitting occurs when a model learns to perform well on the training data, including its noise and outliers, but fails to generalize to new, unseen data, resulting in poor performance on the test set.

13
Q

Provide the formula for the gradient descent update rule.

A

θ ← θ − η ∇J(θ), where θ are the model parameters, η is the learning rate (step size), and ∇J(θ) is the gradient of the cost function with respect to the parameters.

14
Q

Write the cost function for logistic regression.

A

J(θ) = −(1/m) Σ_i [ y_i log(h_θ(x_i)) + (1 − y_i) log(1 − h_θ(x_i)) ], where h_θ(x) = σ(θᵀx) is the sigmoid prediction, y_i ∈ {0, 1} are the labels, and m is the number of training examples (the binary cross-entropy, i.e. the negative log-likelihood).

15
Q

Write the formula for the output of a neuron in an MLP.

A

y = φ(wᵀx + b) = φ(Σ_i w_i x_i + b), where x are the inputs from the previous layer, w and b are the neuron's weights and bias, and φ is the non-linear activation function (e.g., sigmoid, tanh, or ReLU).

16
Q

How does supervised learning differ from unsupervised learning?

A

Supervised learning uses labeled data to train models for specific tasks, whereas unsupervised learning uses unlabeled data to find patterns or representations without a predefined task.

17
Q

What are some common activation functions used in neural networks?

A

Common activation functions include sigmoid, tanh, and ReLU (Rectified Linear Unit).

18
Q

Explain the purpose of the softmax function in a neural network.

A

The softmax function converts the raw output scores of a neural network into probabilities, making it suitable for multi-class classification tasks.

19
Q

Why is backpropagation important in training neural networks?

A

Backpropagation efficiently computes the gradient of the cost function with respect to each weight, allowing for effective optimization and training of deep neural networks.

20
Q

What is the role of the cost function in training neural networks?

A

The cost function measures the error between the predicted outputs and the actual labels, guiding the optimization process to find the best model parameters.

21
Q

How do you prevent overfitting in a neural network?

A

Overfitting can be prevented by using techniques such as cross-validation, regularization, dropout, early stopping, and increasing the size of the training set.

22
Q

Describe the process of gradient descent in training neural networks.

A

Gradient descent involves iteratively updating the model parameters in the direction of the negative gradient of the cost function to minimize the error.
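
A minimal sketch of the loop on a toy one-dimensional cost J(θ) = (θ − 3)²; the cost function, starting point, and learning rate are invented for illustration:

```python
def gradient(theta):
    # Derivative of J(theta) = (theta - 3)^2.
    return 2.0 * (theta - 3.0)

theta = 0.0   # initial parameter value
lr = 0.1      # learning rate (step size)
for _ in range(100):
    theta -= lr * gradient(theta)   # step in the direction of the negative gradient
print(theta)  # approaches the minimiser theta = 3
```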

23
Q

What is the significance of using multiple layers in a neural network?

A

Multiple layers allow the network to learn hierarchical representations, enabling it to model complex patterns and functions that a single layer cannot capture.

24
Q

How does a validation set differ from a test set?

A

The validation set is used to tune hyperparameters and select the best model during training, while the test set is used to evaluate the final model’s performance on unseen data.

25
Q

Explain the concept of “dropout” in neural networks.

A

Dropout is a regularization technique that randomly sets a fraction of the neurons to zero during training, preventing the network from becoming too reliant on specific neurons and reducing overfitting.
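
A minimal sketch of (inverted) dropout applied to a layer's activations; the drop probability is an arbitrary example value, and the rescaling detail may differ from the lecture's formulation:

```python
import numpy as np

def dropout(activations, drop_prob=0.5, training=True, seed=None):
    # During training, randomly zero a fraction of the neurons and rescale the rest
    # so the expected activation is unchanged; at test time, pass values through untouched.
    if not training or drop_prob == 0.0:
        return activations
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= drop_prob
    return activations * mask / (1.0 - drop_prob)
```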

26
Q

What are the challenges of training deep neural networks?

A

Challenges include vanishing and exploding gradients, overfitting, high computational cost, and the need for large amounts of labeled training data.

27
Q

Describe the role of the learning rate in gradient descent.

A

The learning rate determines the step size for each iteration of gradient descent. A learning rate that is too high can cause divergence, while one that is too low leads to slow convergence.