Chapter 2: The Math of Neural Networks Flashcards

1
Q

core building block of a neural network

A

layer

2
Q

Neural layers do what?

A

extract representations out of the data fed to them

3
Q

Chaining together successive neural layers creates what?

A

progressive data distillation

4
Q

Dense Neural layer

A

Fully connected neural layer

5
Q

Parts of the compilation step

A
  1. A loss function
  2. An optimizer
  3. Metrics to monitor during training and testing
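
A minimal Keras sketch of the three compilation parts above (illustrative only; assumes the tf.keras distribution, and the layer sizes are made up):

```python
# Illustrative only: a small stack of Dense (fully connected) layers,
# compiled with the three parts listed above.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(512, activation="relu", input_shape=(28 * 28,)),
    layers.Dense(10, activation="softmax"),
])

model.compile(loss="categorical_crossentropy",   # 1. a loss function
              optimizer="rmsprop",               # 2. an optimizer
              metrics=["accuracy"])              # 3. metrics to monitor
```
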
6
Q

multidimensional NumPy arrays are also called what?

A

Tensors

7
Q

What are tensors?

A

Tensors are a generalization of matrices to an arbitrary number of dimensions: a container for data, almost always numerical data

8
Q

Alternate name for dimension in the context of tensors?

A

Axis

9
Q

Term for the number of axes of a tensor?

A

its rank

10
Q

Tensor that contains only 1 number

A

Scalar

11
Q

An array of numbers

A

Vector

12
Q

1D Tensor

A

Vector

13
Q

0D Tensor

A

Scalar

14
Q

How many Axes do vectors have?

A

1

15
Q

What is a vector’s dimension?

A

the number of entries along its axis; NOT the same as the dimensionality (number of axes) of a tensor

16
Q

An array of vectors

A

Matrix

17
Q

2D Tensor?

A

Matrix
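
A short NumPy sketch (made-up values) tying the last few cards together: a scalar is a 0D tensor, a vector a 1D tensor, and a matrix a 2D tensor, with rank reported by `ndim`.

```python
import numpy as np

x = np.array(12)                 # scalar: 0 axes (0D tensor)
v = np.array([12, 3, 6, 14])     # vector: 1 axis (1D tensor), dimension 4 along that axis
m = np.array([[5, 78, 2],
              [6, 79, 3]])       # matrix: 2 axes (2D tensor)

print(x.ndim, v.ndim, m.ndim)    # rank = number of axes -> 0 1 2
```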

18
Q

3 Key attributes of a Tensor

A
  1. Number of axes
  2. Shape
  3. Data type
19
Q

Tensor Shape

A

A tuple of numbers that describes how many dimensions it has along each axis
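
A quick sketch of the three key attributes on a made-up NumPy tensor:

```python
import numpy as np

t = np.zeros((3, 4), dtype="float32")   # a 2D tensor filled with zeros
print(t.ndim)    # 1. number of axes -> 2
print(t.shape)   # 2. shape -> (3, 4): 3 entries along axis 0, 4 along axis 1
print(t.dtype)   # 3. data type -> float32
```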

20
Q

Tensor Axis convention

A
Axis 0: the samples (batch) axis
Axis 1: the features axis (for 2D vector data); for timeseries data, the time axis by convention
Axis 2: the features axis (for 3D timeseries data)
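
For example (made-up sizes), a batch of timeseries data stored under this convention:

```python
import numpy as np

# 128 samples, each a sequence of 60 timesteps with 8 features per timestep
batch = np.zeros((128, 60, 8))
print(batch.shape)   # (samples, timesteps, features)
```
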
21
Q

Element-wise operations

A

applied independently to each entry in the tensors being considered

22
Q

vectorized implementations

A

implementations of tensor operations that are amenable to massively parallel execution
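
A sketch contrasting a naive Python loop with the equivalent vectorized NumPy call (the helper name `naive_relu` is made up for illustration):

```python
import numpy as np

def naive_relu(x):
    # element-wise: applied independently to each entry of a 2D tensor
    x = x.copy()
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            x[i, j] = max(x[i, j], 0.)
    return x

x = np.random.random((2, 3)) - 0.5
# the vectorized call does the same work in optimized, parallel-friendly routines
assert np.allclose(naive_relu(x), np.maximum(x, 0.))
```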

23
Q

Geometric Interpretation of Deep Learning

A

Neural networks consist entirely of chains of tensor operations and all of these tensor operations are just geometric transformations of the input data

24
Q

Differentiable

A

Its derivative can be computed (the function is smooth and continuous)

25
Q

Gradient

A

The derivative of a tensor operation: the generalization of the concept of a derivative to functions of multidimensional inputs (tensors)

26
Q

Gradient and Loss reduction

A

by shifting the weights slightly in the opposite direction of the gradient
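
A toy, framework-free sketch of that update rule on the one-parameter loss (w - 3)**2, whose gradient is 2 * (w - 3) (values are made up):

```python
learning_rate = 0.1
w = 0.0
for _ in range(50):
    gradient = 2 * (w - 3)           # derivative of the loss with respect to w
    w -= learning_rate * gradient    # step in the opposite direction of the gradient
print(round(w, 3))                   # approaches 3.0, where the loss is smallest
```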

27
Q

mini-batch stochastic gradient descent

A

the process of drawing random mini-batches of the training data to compute each gradient update for the weights of a neural network
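
A minimal sketch of drawing random mini-batches (made-up data, no framework; the update step is only hinted at in comments):

```python
import numpy as np

x = np.random.random((1000, 20))     # 1,000 samples with 20 features each (made up)
batch_size = 128
indices = np.random.permutation(len(x))
for start in range(0, len(x), batch_size):
    batch = x[indices[start:start + batch_size]]
    # a real training loop would compute the loss and gradient on `batch`
    # and nudge the weights here; we just report the batch shape
    print(batch.shape)
```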

28
Q

batch stochastic gradient descent

A

using all of your data for each gradient update, rather than a random mini-batch

29
Q

What role does momentum play in SGD?

A

It addresses two issues:
  1. Convergence speed
  2. Local minima
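
One common formulation of the momentum update, sketched on the same toy loss (w - 3)**2 (hyperparameter values are made up):

```python
learning_rate = 0.1
momentum = 0.9
w, velocity = 0.0, 0.0
for _ in range(200):
    gradient = 2 * (w - 3)
    velocity = momentum * velocity - learning_rate * gradient  # accumulates past updates
    w += velocity    # momentum helps speed up convergence and roll past small local minima
print(round(w, 3))   # approaches 3.0, the minimum of the loss
```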

30
Q

What gives rise to the backpropagation algorithm?

A

applying the chain rule to the computation of the gradient values of a neural network, which gives an efficient way to find the derivative of the loss with respect to each parameter

31
Q

backpropagation

A

starts with the final loss value and works backward from the top layers to the bottom layers, applying the chain rule to compute the contribution that each parameter had to the loss value
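
A hand-worked chain-rule example (toy function, not real backpropagation code): for f(x) = (2x + 1)**2, the gradient is the product of the derivatives of the outer and inner operations.

```python
def f(x):
    return (2 * x + 1) ** 2   # outer op: square; inner op: 2x + 1

x = 3.0
u = 2 * x + 1                  # forward pass through the inner operation
df_du = 2 * u                  # local derivative of the square at u
du_dx = 2                      # local derivative of the inner operation
chain_rule_grad = df_du * du_dx           # 4 * (2x + 1) = 28 at x = 3

eps = 1e-6
numeric_grad = (f(x + eps) - f(x)) / eps  # finite-difference sanity check
print(chain_rule_grad, round(numeric_grad, 3))   # 28.0 28.0
```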

32
Q

Where does the knowledge of the network lie?

A

in the weight tensors (attributes of the layers)

33
Q

Dense neural network layers

A

Fully connected

34
Q

layer compatibility

A

The idea that every layer will only accept input tensors of a certain shape and will return output tensors of a certain shape
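
A small Keras sketch of compatible layers (tf.keras assumed; sizes made up): the first layer accepts 784-dimensional inputs and outputs 32-dimensional tensors, so the next layer must accept 32-dimensional inputs, which Keras infers automatically.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(32, activation="relu", input_shape=(784,)),  # output shape: (batch, 32)
    layers.Dense(10, activation="softmax"),                   # input shape inferred as 32
])
model.summary()
```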

35
Q

network space

A

the network topology you choose constrains your space of possibilities (hypothesis space)

36
Q

Optimizer

A

determines how the network will be updated based on the loss function; it implements a specific variant of stochastic gradient descent

37
Q

Keras backend engines

A

currently just TensorFlow, Theano, and the Microsoft Cognitive Toolkit (CNTK)

38
Q

Keras and hardware

A

Can run on either CPU or GPU

39
Q

Keras on CPU

A

TensorFlow uses a low-level library for tensor operations called Eigen

40
Q

Keras on GPU

A

TensorFlow uses a library of well-optimized deep-learning operations called the NVIDIA CUDA Deep Neural Network library (cuDNN)

41
Q

Ways to define a neural model

A

Sequential or Functional API
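
A hedged sketch of the same two-layer model defined both ways (tf.keras assumed; sizes made up):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sequential: a linear stack of layers
seq_model = keras.Sequential([
    layers.Dense(32, activation="relu", input_shape=(784,)),
    layers.Dense(10, activation="softmax"),
])

# Functional API: layers are called on tensors, allowing arbitrary graphs of layers
inputs = keras.Input(shape=(784,))
x = layers.Dense(32, activation="relu")(inputs)
outputs = layers.Dense(10, activation="softmax")(x)
func_model = keras.Model(inputs=inputs, outputs=outputs)
```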