Lecture 7 Flashcards

1
Q

What is deep learning?

A

A subfield of machine learning that focuses on training deep neural networks.

2
Q

How is a neural network represented?

A

As a graph in which each node represents a scalar value computed from the values arriving on its incoming edges.

3
Q

What is the role of weights in a neural network?

A

Weights determine how much influence each input has on the output.
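
A minimal NumPy sketch (my own toy numbers, not from the lecture): each weight scales one input, so a larger weight gives that input more influence on the neuron's output.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])   # inputs
w = np.array([0.5, -1.0, 2.0])  # weights: one per input
b = 0.1                         # bias

output = np.dot(w, x) + b       # 0.5*1 - 1.0*2 + 2.0*3 + 0.1 = 4.6
print(output)
```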

4
Q

What is a feedforward network?

A

The simplest way of wiring up a neural network: data flows from the input to the output with no loops (cycles) in the graph.
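
A minimal NumPy sketch of such a network, with arbitrary layer sizes (4 inputs, 3 hidden units, 2 outputs); data only ever flows forward.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

# Arbitrary sizes for the sketch: 4 inputs -> 3 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 2)), np.zeros(2)

def feedforward(x):
    h = relu(x @ W1 + b1)   # input layer -> hidden layer
    return h @ W2 + b2      # hidden layer -> output layer (no loops anywhere)

print(feedforward(rng.normal(size=4)))
```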

5
Q

Why is linear algebra important in neural networks?

A

Most operations in neural networks can be written as matrix multiplications.
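
For example (a sketch with made-up sizes), one fully connected layer applied to a whole batch of inputs is a single matrix multiplication:

```python
import numpy as np

batch = np.random.rand(32, 100)   # 32 examples, 100 features each
W = np.random.rand(100, 50)       # layer weights: 100 inputs -> 50 outputs
b = np.random.rand(50)            # biases

outputs = batch @ W + b           # one matmul computes all 32 examples at once
print(outputs.shape)              # (32, 50)
```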

6
Q

What are tensors in deep learning?

A

Generalizations of vectors and matrices to higher dimensions.

7
Q

What is the rank of a tensor?

A

The number of dimensions (axes) of the tensor, i.e. how many indices are needed to address a single element.
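
A small NumPy illustration (my example): `ndim` reports the rank, i.e. how many axes the array has.

```python
import numpy as np

scalar = np.array(3.0)                       # rank 0
vector = np.array([1.0, 2.0, 3.0])           # rank 1
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])  # rank 2
tensor3 = np.zeros((2, 3, 4))                # rank 3

for t in (scalar, vector, matrix, tensor3):
    print(t.ndim, t.shape)
```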

8
Q

What is a scalar in tensor notation?

A

A rank-0 tensor.

9
Q

What is a vector in tensor notation?

A

A rank-1 tensor.

10
Q

What is a matrix in tensor notation?

A

A rank-2 tensor.

11
Q

What is a 3-tensor?

A

A three-dimensional array of numbers.

12
Q

What is one common way to represent images in deep learning?

A

As a 3-tensor with width, height, and color channels.
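
For instance (sizes chosen arbitrarily), a 64x48 RGB image stored in the common height x width x channels layout; note that some frameworks, e.g. PyTorch, put the channel axis first instead.

```python
import numpy as np

image = np.zeros((48, 64, 3), dtype=np.uint8)  # height, width, 3 color channels
print(image.ndim, image.shape)                 # 3 (48, 64, 3) -> a 3-tensor
image[10, 20] = [255, 0, 0]                    # set one pixel to pure red
```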

13
Q

What is a computation graph?

A

A graph that details the computations of a model and its loss function.

14
Q

What is the purpose of a computation graph?

A

To allow automatic differentiation for backpropagation.

15
Q

What is automatic differentiation?

A

A method for computing gradients by tracking computations in a computation graph.
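
A minimal PyTorch sketch (assuming `torch` is installed): the operations on `x` are tracked in a computation graph, which is then used to compute the gradient automatically.

```python
import torch

x = torch.tensor(2.0, requires_grad=True)  # track computations involving x
y = x ** 2 + 3 * x                         # operations are recorded in a graph
y.backward()                               # walk the graph backward
print(x.grad)                              # dy/dx = 2x + 3 = 7 at x = 2
```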

16
Q

What are the two approaches to automatic differentiation?

A

Lazy execution and eager execution.

17
Q

What is lazy execution?

A

A method where the computation graph is built first, then executed.

18
Q

What is eager execution?

A

A method where the computation graph is built on the fly during execution.

19
Q

What is backpropagation?

A

An algorithm for computing gradients in neural networks by propagating errors backward.

20
Q

How does backpropagation use computation graphs?

A

It walks backward through the graph to compute gradients.
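
A hand-worked sketch of that backward walk for the tiny graph L = (w*x - t)^2, with made-up numbers; each step applies the chain rule to one node.

```python
# Forward pass through the graph: w -> y = w*x -> L = (y - t)^2
w, x, t = 1.5, 2.0, 4.0
y = w * x              # y = 3.0
L = (y - t) ** 2       # L = 1.0

# Backward pass: walk the graph in reverse, multiplying local gradients (chain rule).
dL_dy = 2 * (y - t)    # from the square node: -2.0
dy_dw = x              # from the multiply node: 2.0
dL_dw = dL_dy * dy_dw  # gradient reaching w: -4.0
print(dL_dw)
```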

21
Q

What is the difference between lazy and eager execution?

A

Lazy execution builds the whole graph first, so the computation can be optimized before it runs; eager execution runs each operation immediately, which is harder to optimize.
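
A framework-free sketch of the two styles (purely illustrative, not any library's actual API):

```python
# Eager style: every operation runs the moment it is called.
def eager(x):
    y = x * 2      # computed now
    return y + 1   # computed now

# Lazy style: first build a description of the computation, then execute it.
graph = [("mul", 2), ("add", 1)]   # build phase: nothing is computed yet

def run(graph, x):
    # Execute phase: because the whole graph is known, it could be
    # optimized or compiled before this loop ever runs.
    for op, c in graph:
        x = x * c if op == "mul" else x + c
    return x

print(eager(3), run(graph, 3))     # 7 7
```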

22
Q

What is an example of a deep learning framework using lazy execution?

A

TensorFlow 1.x.

23
Q

What is an example of a deep learning framework using eager execution?

A

PyTorch and TensorFlow 2.

24
Q

What does a function in deep learning consist of?

A

A forward pass (computing outputs) and a backward pass (computing gradients).
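
As a toy sketch of that pairing (my own example): a single multiply node with an explicit forward pass and backward pass.

```python
class Multiply:
    """One computation-graph node: forward computes the output,
    backward computes gradients with respect to the cached inputs."""

    def forward(self, a, b):
        self.a, self.b = a, b    # cache inputs for the backward pass
        return a * b

    def backward(self, grad_out):
        # Chain rule: d(a*b)/da = b and d(a*b)/db = a, each scaled by the
        # gradient flowing in from the node above.
        return grad_out * self.b, grad_out * self.a

node = Multiply()
out = node.forward(3.0, 4.0)     # forward pass: 12.0
grads = node.backward(1.0)       # backward pass: (4.0, 3.0)
print(out, grads)
```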

25
Q

What is gradient descent used for?

A

To update neural network weights based on computed gradients.
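
A bare-bones sketch of the update rule, minimizing f(w) = (w - 3)^2 with an arbitrarily chosen learning rate and starting point:

```python
w = 0.0                  # initial weight
lr = 0.1                 # learning rate

for step in range(50):
    grad = 2 * (w - 3)   # gradient of the loss f(w) = (w - 3)**2
    w -= lr * grad       # move against the gradient
print(w)                 # close to the minimum at w = 3
```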

26
Q

What happens if the learning rate is too high in gradient descent?

A

The updates may overshoot the minimum, so the loss can diverge and the model may fail to converge.

27
Q

What happens if the learning rate is too low in gradient descent?

A

The model may take too long to learn.

28
Q

Why are tensors useful for deep learning computations?

A

They allow for efficient parallelization and computation on GPUs.
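
A short PyTorch sketch (assumes PyTorch is installed; uses the GPU only if one is available): the same matrix multiply runs wherever the tensors live, and is parallelized on the GPU.

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b          # the matmul is parallelized on the GPU when device == "cuda"
print(c.device)
```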

29
Q

What is the main takeaway from automatic differentiation?

A

It enables efficient gradient computation, simplifying neural network training.