Class 8 Flashcards

1
Q

deep learning

A

broad family of techniques for ML in which the hypotheses take the form of a complex algebraic circuit with tunable connection strengths

2
Q

neural networks

A

computational networks trained by deep learning methods; named for their loose resemblance to networks of neurons in the brain

3
Q

feedforward network

A

neural network with connections in only one direction – forms a directed acyclic graph (DAG) with designated input and output nodes
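
(Illustration, not part of the card: a minimal sketch of a feedforward pass in NumPy – the layer sizes and names here are assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Tunable connection strengths: weights and biases for two layers.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input (3) -> hidden (4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # hidden (4) -> output (2)

def feedforward(x):
    """Information flows in one direction only: input -> hidden -> output."""
    h = np.tanh(W1 @ x + b1)   # nonlinear hidden layer
    return W2 @ h + b2         # linear output layer

print(feedforward(np.array([1.0, 0.5, -0.2])))
```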

4
Q

recurrent network

A

neural network that feeds its intermediate or final outputs back into its own inputs

5
Q

universal approximation theorem

A

states that a network with just two layers of computation – the first nonlinear, the second linear – can approximate any continuous function to an arbitrary degree of accuracy
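
(Illustration, not part of the card: a sketch of the two-layer idea – a fixed random nonlinear first layer plus a linear second layer fit by least squares approximates sin(x) closely; the tanh nonlinearity and all sizes are assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)
target = np.sin(x)

# Layer 1 (nonlinear): 50 random tanh features, held fixed.
n_hidden = 50
w, b = rng.normal(size=n_hidden), rng.uniform(-np.pi, np.pi, n_hidden)
H = np.tanh(np.outer(x, w) + b)            # shape (200, 50)

# Layer 2 (linear): fit the output weights by least squares.
c, *_ = np.linalg.lstsq(H, target, rcond=None)
approx = H @ c

print(np.max(np.abs(approx - target)))     # small max error
```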

6
Q

activation function

A

the nonlinear function a unit applies to its weighted sum of inputs (e.g., sigmoid, tanh, ReLU) – it supplies the nonlinear part of each layer's computation

7
Q

relu

A

rectified linear unit: ReLU(x) = max(0, x)

8
Q

softplus

A

smooth version of ReLU: softplus(x) = log(1 + e^x)
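
(Illustration, not part of the card: both activation functions in NumPy.)

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)      # ReLU(x) = max(0, x)

def softplus(x):
    return np.logaddexp(0.0, x)    # log(1 + e^x), computed stably

xs = np.linspace(-3.0, 3.0, 7)
print(relu(xs))
print(softplus(xs))                # smooth everywhere; its derivative is the sigmoid
```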

9
Q

vanishing gradient

A

error signals shrink toward zero as they are propagated back through the layers of a deep network, leaving the early layers nearly untrained
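
(Illustration, not part of the card: a toy chain of sigmoid units showing the effect numerically; the weight value 0.5 is an arbitrary assumption.)

```python
import numpy as np

def dsigmoid(a):
    s = 1.0 / (1.0 + np.exp(-a))
    return s * (1.0 - s)           # at most 0.25

grad, w = 1.0, 0.5
for _ in range(20):                # 20-layer chain of sigmoid units
    grad *= w * dsigmoid(0.0)      # chain rule: one factor per layer
print(grad)                        # ~(0.5 * 0.25)**20 -- essentially zero
```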

10
Q

automatic differentiation

A

applies the rules of calculus in a systematic way to calculate gradients for any numeric program
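
(Illustration, not part of the card: a minimal forward-mode automatic differentiator using dual numbers; the reverse-mode autodiff used by deep learning libraries is far more general, but the principle is the same.)

```python
class Dual:
    """Dual number (value, derivative): each arithmetic operation
    applies the corresponding rule of calculus alongside the arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)  # sum rule
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # any numeric program built from + and *

x = Dual(2.0, 1.0)                 # seed the derivative dx/dx = 1
print(f(x).val, f(x).dot)          # f(2) = 17; f'(x) = 6x + 2, so f'(2) = 14
```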

11
Q

one hot encoding

A

encodes a non-numeric (categorical) attribute with d possible values as a vector of d bits, exactly one of which is 1
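
(Illustration, not part of the card: a minimal one-hot encoder in NumPy; the attribute and its values are made up.)

```python
import numpy as np

colors = ["red", "green", "blue"]                 # a categorical attribute
index = {c: i for i, c in enumerate(colors)}

def one_hot(value):
    v = np.zeros(len(colors))
    v[index[value]] = 1.0                         # exactly one bit is "hot"
    return v

print(one_hot("green"))                           # [0. 1. 0.]
```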

12
Q

convolutional neural network

A

neural network that contains spatially local connections, at least in the early layers

13
Q

kernel

A

pattern of weights that is replicated across multiple local regions

14
Q

convolution

A

process of applying the kernel to the pixels of the image

15
Q

stride

A

size of the step that the kernel takes across an image
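
(Illustration tying the last three cards together, not part of any card: a 2D convolution in NumPy with an explicit stride; strictly this computes cross-correlation, as is conventional in deep learning.)

```python
import numpy as np

def convolve2d(image, kernel, stride=1):
    """Slide the kernel across the image in steps of `stride`."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros(((ih - kh) // stride + 1, (iw - kw) // stride + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i * stride:i * stride + kh,
                          j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)   # same weights at every location
    return out

image = np.arange(36.0).reshape(6, 6)
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])     # a 2x2 pattern of weights
print(convolve2d(image, kernel, stride=2).shape) # (3, 3): stride 2 downsamples
```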

16
Q

receptive field

A

the portion of the sensory input that can affect a unit's activation

17
Q

pooling layer

A

layer in a neural network that summarizes a set of adjacent units from the preceding layer with a single value

18
Q

downsampling

A

process of making the stride larger – coarsens the resulting image
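
(Illustration for the last two cards, not part of either: max-pooling summarizes each 2x2 block of adjacent units with a single value, coarsening the map; the block size is an assumption.)

```python
import numpy as np

def max_pool(feature_map, size=2):
    """Replace each size x size block of adjacent units with its maximum."""
    h, w = feature_map.shape
    out = feature_map[:h - h % size, :w - w % size]
    out = out.reshape(h // size, size, w // size, size)
    return out.max(axis=(1, 3))        # one value per block

fm = np.arange(16.0).reshape(4, 4)
print(max_pool(fm))                    # 2x2 output: [[ 5.  7.] [13. 15.]]
```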

19
Q

tensors

A

multidimensional arrays of any dimension

20
Q

feature map

A

output activations for a given filter (kernel) applied across the input – one slice of the layer's output tensor

21
Q

channels

A

the dimension of a tensor that indexes different kinds of features – e.g., the red, green, and blue color channels of an image, or the feature maps produced by different kernels

22
Q

residual networks

A

neural networks that avoid the problem of vanishing gradients when building a very deep network by adding skip connections, so each layer perturbs the previous layer's representation rather than replacing it
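
(Illustration, not part of the card: the skip-connection pattern in NumPy; the ReLU layer standing in for the learned perturbation f is an assumption.)

```python
import numpy as np

def residual_block(x, W):
    """Skip connection: output = input + learned perturbation f(x).
    The identity path carries gradients back unattenuated."""
    return x + np.maximum(0.0, W @ x)

x = np.ones(4)
W = 0.01 * np.ones((4, 4))
print(residual_block(x, W))            # close to x: the block only perturbs it
```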

23
Q

batch normalization

A

improves the rate of convergence of SGD by rescaling the values generated by the internal layers of the network, using means and variances computed from the examples within each minibatch
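
(Illustration, not part of the card: the per-minibatch rescaling in NumPy; in practice gamma and beta are learned parameters.)

```python
import numpy as np

def batch_norm(z, gamma=1.0, beta=0.0, eps=1e-5):
    """Rescale each unit's values across the examples in a minibatch."""
    mu = z.mean(axis=0)                       # per-unit mean over the batch
    var = z.var(axis=0)                       # per-unit variance over the batch
    z_hat = (z - mu) / np.sqrt(var + eps)     # zero mean, unit variance
    return gamma * z_hat + beta               # learnable rescale and shift

z = np.random.default_rng(0).normal(5.0, 3.0, size=(32, 4))  # batch of 32
print(batch_norm(z).mean(axis=0).round(6))    # ~0 per unit
print(batch_norm(z).std(axis=0).round(3))     # ~1 per unit
```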

24
Q

neural architecture search

A

used to explore the state space of possible network architectures – using a neural network to find the best neural network

25
Q

weight decay

A

adding a penalty on the size of the weights to the loss function (the usual form of regularization for a neural network)

26
Q

dropout

A

technique for introducing noise at training time by randomly deactivating units, which forces the model to become more robust (an effect similar in spirit to bagging an ensemble of subnetworks)
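
(Illustration, not part of the card: "inverted" dropout in NumPy, with the common choice p = 0.5 as an assumption.)

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, p=0.5):
    """Training-time noise: deactivate each unit with probability p."""
    mask = rng.random(h.shape) >= p    # keep a unit with probability 1 - p
    return h * mask / (1.0 - p)        # rescale so the expected activation is unchanged

print(dropout(np.ones(10)))            # roughly half the units zeroed out
```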

27
Q

recurrent neural networks

A

neural networks that are distinct from feedforward networks in that they allow cycles in the computation graph
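
(Illustration, not part of the card: the recurrence in NumPy; the sizes and the tanh update are assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)
W_hh, W_xh = rng.normal(size=(3, 3)), rng.normal(size=(3, 2))

def rnn_step(h, x):
    """The cycle: the new hidden state feeds back in at the next step."""
    return np.tanh(W_hh @ h + W_xh @ x)

h = np.zeros(3)
for x in rng.normal(size=(5, 2)):      # a sequence of 5 inputs
    h = rnn_step(h, x)                 # same weights reused at every step
print(h)
```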

28
Q

markov assumption

A

assumption that the current state depends only on a finite, fixed number of previous states; RNNs build this in, computing the hidden state from the previous hidden state and the current input

29
Q

gating units

A

vectors of values between 0 and 1 that control the flow of information in an LSTM via elementwise multiplication with the corresponding information vector
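
(Illustration, not part of the card: elementwise gating of an LSTM-style memory cell; the pre-activation values are made up and the rest of the LSTM update is omitted.)

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gated_update(c, forget_a, input_a, candidate_a):
    """Gates are vectors in (0, 1), applied by elementwise multiplication."""
    f = sigmoid(forget_a)              # forget gate: how much old memory to keep
    i = sigmoid(input_a)               # input gate: how much new content to admit
    g = np.tanh(candidate_a)           # candidate new content
    return f * c + i * g               # elementwise gating of the memory cell

c = np.array([1.0, -1.0, 0.5])
print(gated_update(c, np.array([4.0, -4.0, 0.0]),
                   np.array([-4.0, 4.0, 0.0]), np.array([0.0, 2.0, -2.0])))
```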

30
Q

unsupervised learning

A

takes a set of unlabeled examples; may try to learn a new representation (e.g., features that describe images), or may try to learn a generative model of the data

31
Q

generator

A

network that maps values z (typically random noise vectors) to samples from the target distribution

32
Q

discriminator

A

network that classifies inputs as real (from the training set) or fake (from the generator)

33
Q

generative adversarial network

A

pair of networks – a generator and a discriminator – that are trained adversarially and combine to form a generative system
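
(Illustration for the last three cards, not part of any card: two untrained networks and their opposing losses; the linear generator and logistic discriminator are assumptions, and the training loop is omitted.)

```python
import numpy as np

rng = np.random.default_rng(0)
Wg = rng.normal(size=(2, 2))           # generator weights (untrained)
wd = rng.normal(size=2)                # discriminator weights (untrained)

def generator(z):
    return Wg @ z                      # maps noise z to a candidate sample

def discriminator(x):
    return 1.0 / (1.0 + np.exp(-wd @ x))   # estimated P(x is real)

real = np.array([1.0, 2.0])            # one example from the training set
fake = generator(rng.normal(size=2))   # one generated example

# The discriminator wants D(real) -> 1 and D(fake) -> 0;
# the generator wants D(fake) -> 1: the losses pull in opposite directions.
d_loss = -np.log(discriminator(real)) - np.log(1.0 - discriminator(fake))
g_loss = -np.log(discriminator(fake))
print(d_loss, g_loss)
```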

34
Q

transfer learning

A

occurs when experience with one learning task helps an agent learn better on another task

35
Q

multitask learning

A

form of transfer learning in which we simultaneously train a model on multiple objectives

36
Q

deep reinforcement learning

A

reinforcement learning in which the value function or policy is represented by a deep neural network (a multilayer computation graph)