Topic 2 Flashcards

Data Mining & Machine Learning: Introduction

1
Q

Perceptron neurons

A

A perceptron takes several binary inputs, x1, x2, …, and produces a single binary output.

2
Q

Weights

A

real numbers expressing the importance of the respective inputs to the output

3
Q

Threshold value

A

The neuron’s output, 0 or 1, is determined by whether the weighted sum of the inputs is less than or greater than some threshold value.

4
Q

Layer

A

The outputs of the first layer can feed a second layer, the second a third, and so on, creating increasingly nuanced, abstract decisions.

5
Q

Bias

A

bias = -threshold, a measure of how easy it is to get the perceptron to output a 1. Or, to put it in more biological terms, the bias is a measure of how easy it is to get the perceptron to fire.

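
The perceptron rule from the cards above (weighted sum, threshold, and bias = -threshold) can be sketched in a few lines of Python; the function name and the particular weights below are illustrative, not from the cards:

```python
# Perceptron: output 1 if the weighted sum plus bias is positive, else 0.
# Since bias = -threshold, "w . x + b > 0" is the same test as "w . x > threshold".
def perceptron(x, w, b):
    weighted_sum = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if weighted_sum + b > 0 else 0

# Example: two inputs, equal weights, bias -1.5 (i.e. threshold 1.5) -> an AND gate.
print(perceptron([1, 1], [1, 1], -1.5))  # fires only when both inputs are 1
```

A large positive bias makes the neuron easy to fire; a large negative bias makes it hard.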
6
Q

NAND gate

A

A perceptron can implement a NAND gate, and any computation can be built from NAND gates, so networks of perceptrons can compute any logical function.

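
One concrete choice of parameters (a standard textbook example, not stated on the card): weights of -2 on each input and a bias of +3 make a two-input perceptron compute NAND. A quick check:

```python
# A two-input perceptron with weights -2, -2 and bias +3 computes NAND:
# it outputs 0 only when both inputs are 1.
def nand(x1, x2):
    return 1 if (-2 * x1) + (-2 * x2) + 3 > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", nand(x1, x2))
# 0 0 -> 1, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```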
7
Q

Input layer

A

Input-layer perceptrons are really special units that are simply defined to output the desired values.

8
Q

Learning algorithms

A

can automatically tune the weights and biases of a network of artificial neurons. This tuning happens in response to external stimuli, without direct intervention by a programmer.

9
Q

Sigmoid neuron

A

More tunable than perceptrons: small changes in the weights and bias cause only small changes in the output; also called logistic neurons.

10
Q

Sigmoid function

A

sigma(z) = 1 / (1 + exp(-z))

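
The formula on the card translates directly to code; a minimal sketch:

```python
import math

# Sigmoid (logistic) function: sigma(z) = 1 / (1 + exp(-z)).
# Squashes any real z into the open interval (0, 1); sigma(0) = 0.5.
def sigma(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigma(0))    # 0.5
print(sigma(10))   # close to 1
print(sigma(-10))  # close to 0
```

For large positive z the output approaches 1 (like a perceptron firing); for large negative z it approaches 0, but the transition is smooth rather than a step.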
11
Q

Activation function

A

the function a neuron applies to its weighted input to produce its output; the perceptron’s step function and the sigmoid function are examples

12
Q

Input neurons

A

the neurons making up the input layer

13
Q

Output neurons

A

the neuron(s) making up the output layer

14
Q

hidden layer

A

layers between input and output layers

15
Q

Multilayer perceptrons

A

or MLPs, another name for multiple layer networks

16
Q

Feedforward neural networks

A

Open-loop design: all neurons feed in a single direction, with no feedback loops.
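
A minimal sketch of a feedforward pass, reusing the sigmoid from the earlier cards (the layer sizes and weights here are illustrative, not from the cards): each layer’s outputs feed the next layer, in one direction only.

```python
import math

def sigma(z):
    return 1.0 / (1.0 + math.exp(-z))

# One layer: each neuron computes sigma(w . x + b).
def layer(x, weights, biases):
    return [sigma(sum(w * xi for w, xi in zip(ws, x)) + b)
            for ws, b in zip(weights, biases)]

# Feedforward: activations flow input -> hidden -> output, never backwards.
def feedforward(x, network):
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# Toy network: 2 input neurons -> 2 hidden neurons -> 1 output neuron.
net = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2]),  # hidden layer
    ([[1.0, -1.0]], [0.0]),                    # output layer
]
print(feedforward([1.0, 0.0], net))
```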

17
Q

Recurrent networks

A

feedback loops are possible; neurons fire for some limited duration of time, before becoming quiescent

18
Q

Cost function

A

Quantifies how well the algorithm is performing; the goal is to minimize it. Also called a “loss” or “objective” function.

19
Q

Quadratic cost function

A

Also called the Mean Squared Error (MSE) function; smooth cost functions make gradual tuning easier than discrete-valued functions would.

20
Q

Gradient descent algorithm

A

A minimization algorithm that seeks the minimum by calculating derivatives (the slope) and moving “downhill”.
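
A one-dimensional sketch of the idea (the cost function and learning rate below are illustrative): repeatedly step against the slope until the cost stops decreasing.

```python
# Gradient descent on the cost C(w) = (w - 3)^2, whose minimum is at w = 3.
# The derivative is dC/dw = 2 * (w - 3); each step moves "downhill" against it.
def gradient_descent(w, eta=0.1, steps=100):
    for _ in range(steps):
        grad = 2 * (w - 3)
        w = w - eta * grad
    return w

print(gradient_descent(0.0))  # approaches 3.0
```

Here eta is the learning rate from the later cards: too small and descent crawls, too large and the steps overshoot the minimum.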

22
Q

stochastic gradient descent

A

Much faster than computing the gradient over the full training set: select a small number of randomly chosen training inputs and use their average cost gradient as an estimate of the true gradient of the cost.
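
A sketch of the mini-batch idea (the fitting problem, batch size, and learning rate below are illustrative): the gradient is estimated from a small random sample rather than the whole training set.

```python
import random

# Fit w so that y ~ w * x, minimizing mean squared error.
# Full gradient descent would average the gradient over all 100 points;
# stochastic gradient descent averages over a small random mini-batch instead.
data = [(x, 2.0 * x) for x in range(1, 101)]  # true w is 2.0

def sgd(data, w=0.0, eta=0.0001, batch_size=10, epochs=200):
    for _ in range(epochs):
        batch = random.sample(data, batch_size)  # mini-batch: random sample of inputs
        # gradient of mean((w*x - y)^2) with respect to w, estimated on the batch
        grad = sum(2 * (w * x - y) * x for x, y in batch) / batch_size
        w -= eta * grad
    return w

print(round(sgd(data), 2))  # close to 2.0
```

Each pass over the loop uses only batch_size points, so an update costs a tenth of a full-gradient step here, at the price of a noisier gradient estimate.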

23
Q

learning rate

A

a small, positive parameter used to define step-size, or how quickly the algorithm can move along the gradient

24
Q

Mini-batch

A

a random sample of training inputs used in stochastic gradient descent

25
Q

Epoch

A

one complete pass through the entire training set

26
Q

Validation set

A

data not used in training, to validate the algorithm hasn’t overfit and will work on unseen data

27
Q

hyper-parameters

A

parameters not directly selected by the learning algorithm, e.g. the learning rate

28
Q

deep-neural networks

A

Many-layer structure (two or more hidden layers): a series of many layers, with early layers answering very simple and specific questions about the input image, and later layers building up a hierarchy of ever more complex and abstract concepts.