Topic 2 Flashcards
Data Mining & Machine Learning: Introduction
Perceptron neurons
A perceptron takes several binary inputs, x1, x2, ..., and produces a single binary output
Weights
real numbers expressing the importance of the respective inputs to the output
Threshold value
neuron’s output, 0 or 1, is determined by whether the weighted sum of the inputs is less than or greater than some threshold value
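A minimal Python sketch of this rule (the inputs, weights, and threshold below are made-up illustrative values):

```python
def perceptron_output(inputs, weights, threshold):
    """Binary output: 1 if the weighted sum of the inputs exceeds the threshold, else 0."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

# Two binary inputs with weights 0.6 and 0.4 and a threshold of 0.5.
print(perceptron_output([1, 0], [0.6, 0.4], 0.5))  # 1, since 0.6 > 0.5
print(perceptron_output([0, 1], [0.6, 0.4], 0.5))  # 0, since 0.4 <= 0.5
```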
Layer
The outputs of the first layer can feed a second layer, the second a third, and so on, creating more nuanced, abstract decisions.
Bias
bias = -threshold, a measure of how easy it is to get the perceptron to output a 1. Or, to put it in more biological terms, the bias is a measure of how easy it is to get the perceptron to fire.
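The same rule rewritten with a bias instead of a threshold (illustrative values again):

```python
def perceptron_fires(inputs, weights, bias):
    """Output 1 ("fire") if the weighted sum plus the bias is positive, else 0."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if z > 0 else 0

# bias = -threshold: a large positive bias makes the perceptron easy to fire,
# a very negative bias makes it hard to fire.
print(perceptron_fires([1, 0], [0.6, 0.4], bias=-0.5))  # 1, since 0.6 - 0.5 > 0
print(perceptron_fires([1, 0], [0.6, 0.4], bias=-3.0))  # 0, much harder to fire
```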
NAND gate
Any computation can be built using NAND gates, and a perceptron can implement a NAND gate, so networks of perceptrons can compute any function.
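For instance, weights of -2, -2 and a bias of 3 give a perceptron that computes NAND; this is just one choice of values that works:

```python
def nand_perceptron(x1, x2):
    """A perceptron with weights -2, -2 and bias 3 computes NAND."""
    z = (-2) * x1 + (-2) * x2 + 3
    return 1 if z > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", nand_perceptron(x1, x2))  # 0 only for input (1, 1)
```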
Input layer
input layer perceptrons are really special units which are simply defined to output the desired values
Learning algorithms
can automatically tune the weights and biases of a network of artificial neurons. This tuning happens in response to external stimuli, without direct intervention by a programmer.
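A hedged sketch of one such algorithm, the classic perceptron learning rule (not named on this card); the toy data, learning rate, and epoch count are illustrative:

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Classic perceptron learning rule: nudge the weights and bias after each mistake."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            output = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
            error = target - output            # +1, 0, or -1
            weights[0] += lr * error * x1
            weights[1] += lr * error * x2
            bias += lr * error
    return weights, bias

# Toy task: learn the NAND truth table from labelled examples.
nand_data = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(train_perceptron(nand_data))
```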
Sigmoid neuron
more tunable than perceptrons: small changes to the weights and biases cause only small changes to the output; also called logistic neurons
Sigmoid function
sigma(z) = 1 / (1 + exp(-z))
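The same formula in Python, using only the standard library:

```python
import math

def sigmoid(z):
    """sigma(z) = 1 / (1 + exp(-z)), which squashes any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(-5), sigmoid(0), sigmoid(5))  # ~0.0067, 0.5, ~0.9933
```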
Activation function
the function a neuron applies to its weighted input plus bias to produce its output; the perceptron's step function and the sigmoid function are examples
Input neurons
the neurons making up the input layer
Output neurons
the neuron(s) making up the output layer
Hidden layer
layers between input and output layers
Multilayer perceptrons
or MLPs, another name for networks with multiple layers (despite the name, such networks are often made up of sigmoid neurons rather than perceptrons)
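A minimal sketch of a forward pass through a multiple-layer network of sigmoid neurons; the layer sizes, weights, and biases are made-up illustrative values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer_output(inputs, weights, biases):
    """One layer: each neuron outputs sigmoid(weighted sum of inputs + bias)."""
    return [sigmoid(sum(w * x for w, x in zip(neuron_w, inputs)) + b)
            for neuron_w, b in zip(weights, biases)]

# Illustrative network: 2 input neurons -> 2 hidden neurons -> 1 output neuron.
x = [0.5, 0.9]                                                     # input layer
hidden = layer_output(x, [[0.2, -0.4], [0.7, 0.1]], [0.0, -0.3])   # hidden layer
output = layer_output(hidden, [[1.0, -1.5]], [0.2])                # output layer
print(output)
```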