Artificial Neural Networks and Applications Flashcards

1
Q

What is a neural network?

A

a model that mimics the behavior of biological neurons in the human brain

2
Q

How does a NN calculate an output?

A

passes input through an array of neurons

3
Q

What kinds of complicated problems do NNs perform exceptionally well at solving?

A

text, voice, and image recognition, and NLP

4
Q

What is a type of connectionist model that has become a mainstay of AI and cognitive science?

A

neural networks

5
Q

Where does human intelligence begin?

A

with the connections between neurons

6
Q

What are the three milestones in neural network models?

A

Single layer perceptron, multi-layer perceptron, and DNN

7
Q

Who proposed the single-layer perceptron model, and when?

A

Rosenblatt in 1957

8
Q

Who proposed the multi-layer perceptron model, and when?

A

Rumelhart in 1986

9
Q

Who proposed the DNN model, and when?

A

Hinton in 2006

10
Q

Which algorithm is a key algorithm of the single layer perceptron model?

A

the perceptron algorithm

11
Q

Which algorithm is a key algorithm of the multi-layer perceptron model?

A

backpropagation algorithm

12
Q

Which algorithm is a key algorithm of the DNN model?

A

deep learning algorithm

13
Q

What is the main role common to all neural networks?

A

the learning function

14
Q

What types of information constitute large-scale multimedia information?

A

audio and video information

15
Q

True or false: The data to be learned and recognized is small and easy to handle?

A

false

16
Q

What structures comprise a neuron?

A

dendrites, synapses, axons, and terminals

17
Q

What is the role of a neuron?

A

sensing stimuli (in sensory organ neurons) and signaling across the neural network

18
Q

What are the components of a perceptron?

A

weights, bias, and activation functions

19
Q

What is an activation function?

A

a function that expresses the activation/deactivation of a neuron

20
Q

What are a few types of activation functions?

A

the step function, sigmoid, tanh, and ReLU

21
Q

Why do we use activation functions?

A

we need non-linear functions

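The four activation functions named above can be sketched in plain Python (a minimal illustration; the implementations are standard, but the snippet itself is not from the deck):

```python
import math

def step(x):
    """Step function: fires (1) if the input is non-negative, else 0."""
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    """Sigmoid: smooth output between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Hyperbolic tangent: smooth output between -1 and 1."""
    return math.tanh(x)

def relu(x):
    """ReLU: passes positive inputs unchanged, zeroes out negatives."""
    return max(0.0, x)

# Compare the four functions at a few sample points.
for f in (step, sigmoid, tanh, relu):
    print(f.__name__, f(-1.0), f(0.0), f(2.0))
```

All four are non-linear, which is exactly why they are usable as activation functions.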
22
Q

What is the function of a dendrite?

A

receives stimuli from other neurons or surroundings and transmits impulses to cell body via electrical signals

23
Q

What is a synapse?

A

the junction of cells where the axons of one neuron and the dendrites of the next neuron meet

24
Q

What is an axon?

A

a branch of a neuron whose function is to transmit signals to other neurons

25
Q

What is the function of a terminal?

A

receives transmitted electrical signals and secretes neurotransmitters into synapses

26
Q

What happens to signals from sensory organs?

A

they pass through the brain's network of neurons and get converted into meaningful signals

27
Q

Where is the focus of current research?

A

implementing learning-capable computing

28
Q

What is the behavior of a single neuron?

A

n inputs -> operation* (1 or 0) -> m outputs

29
Q

What happens within the *operation (1 or 0) function?

A

the neuron emits a signal when the combined input crosses a threshold, according to the rules of the cell body; otherwise it emits no signal

30
Q

What is the function of a weight?

A

controls the importance of the input signal to the output

31
Q

What is the function of a bias?

A

controls how easily neurons are activated

32
Q

What is the function of an activation function?

A

to pass vs. not to pass

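Cards 28-32 describe a single neuron: weighted inputs, a bias, and a threshold decision. A minimal sketch (the particular weights and bias are made-up illustrative values that happen to realize AND):

```python
# A single artificial neuron: n inputs -> weighted sum + bias -> threshold -> output.
# Weights control the importance of each input; the bias controls how easily
# the neuron activates.

def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0  # emits a signal only past the threshold

# A bias of -1.5 means both inputs must be active to cross the threshold (AND).
print(neuron([1, 1], [1.0, 1.0], -1.5))  # -> 1
print(neuron([1, 0], [1.0, 1.0], -1.5))  # -> 0
```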
33
Q

What are adjusted through learning?

A

weights and biases

34
Q

Why are weights and biases adjusted through learning?

A

to strengthen network connections for relevant signals and weaken those for unrelated signals

35
Q

True or false: The activation function used for the neural network doesn't matter?

A

false, you must choose the appropriate function

36
Q

An activation function does what?

A

takes the sum of the inputs and calculates the output

37
Q

When the sum of the inputs exceeds a certain threshold, what happens?

A

the neuron is activated

38
Q

Neurons follow what type of activation function?

A

a step function

39
Q

True or false: MLPs (multilayer perceptrons) use a variety of activation functions?

A

true

40
Q

True or false: All activation functions are nonlinear functions?

A

true

41
Q

Why is it difficult to improve performance if you use a linear function as an activation function?

A

combining linear functions is the same as using one linear function, so even if you have multiple layers, the network really only expresses one layer

42
Q

What is a multilayer perceptron?

A

a perceptron that has a hidden layer between the input and output layers

43
Q

What are MLPs notable for?

A

being able to distinguish data that is not linearly separable

44
Q

What do MLPs use for training the network?

A

backpropagation

45
Q

What are the limitations of an MLP?

A

the black box model and overfitting

46
Q

What is the black box model?

A

since the perceptron is a black box, it is sometimes unknown how the network makes predictions or judgements

47
Q

What is overfitting?

A

if the model is too complicated or the training data is too restricted, MLPs may easily overfit the training data

48
Q

What is the forward pass in MLPs?

A

the process by which input signals are applied to the input layer units and propagated through the hidden layer to the output layer

49
Q

What is a loss function?

A

a mathematical function that measures the difference between predicted and actual values in a machine learning model

50
Q

How does the loss function determine the model's performance?

A

by comparing the distance between the prediction output and the target values

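The loss function idea above can be sketched as mean squared error in Python (MSE is an illustrative choice; the deck does not name a specific loss):

```python
# Mean squared error: measures the distance between predictions and targets.
# The closer the predictions, the smaller the loss.
def mse(predictions, targets):
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

print(mse([0.9, 0.2, 0.8], [1.0, 0.0, 1.0]))  # small distance -> small loss
```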
51
Q

What does the backpropagation algorithm do?

A

computes the output with a forward pass given the input, then calculates the error between the actual output and the computed output

52
Q

Why is backpropagation useful?

A

by propagating this error in the reverse direction, the weights are adjusted in the direction that reduces the error

53
Q

What is backpropagation used for?

A

supervised learning of artificial neural networks using gradient descent

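The forward-pass-then-error-correction cycle above can be shown as a single weight update on one sigmoid neuron (a minimal sketch; the input, target, and learning rate are made-up values):

```python
import math

# One backpropagation-style update on a single sigmoid neuron,
# with loss = (y - t)^2 / 2 for a single training example.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, t = 1.0, 1.0   # input and target (illustrative values)
w, lr = 0.0, 1.0  # initial weight and learning rate

y = sigmoid(w * x)                  # forward pass
loss_before = 0.5 * (y - t) ** 2
grad = (y - t) * y * (1.0 - y) * x  # dLoss/dw via the chain rule
w -= lr * grad                      # move against the gradient

y_new = sigmoid(w * x)
loss_after = 0.5 * (y_new - t) ** 2
print(loss_before, loss_after)      # the error shrinks after the update
```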
54
Q

What is deep learning?

A

stacking neural networks on top of each other

55
Q

What has deep learning recently been applied to?

A

computer vision, speech recognition, NLP, social network filtering, machine translation, etc.

56
Q

Why do we use deep learning?

A

to mimic the human brain and improve the performance of AI

57
Q

What algorithm caused a jumpstart in research on deep learning and hidden-layer learning?

A

backpropagation

58
Q

What techniques resolved the problem of overfitting?

A

dropout and ReLU

59
Q

What activation function resolved the vanishing gradient problem, where the gradient value decreases as the number of hidden layers increases?

A

ReLU

60
Q

What is another limitation of MLP that has been resolved?

A

computing speed limits

61
Q

What was the first neural network capable of recognizing letters?

A

Rosenblatt's (single-layer) Perceptron in 1957

62
Q

What was the first neural network ever?

A

the McCulloch-Pitts neuron in 1943

63
Q

What was the McCulloch-Pitts neuron the basis for?

A

the perceptron algorithm

64
Q

What is the function of the McCulloch-Pitts neuron?

A

receives one or more inputs and sums them to produce an output

65
Q

What can the McCulloch-Pitts neuron be used for?

A

classification problems

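The McCulloch-Pitts neuron above (sum the inputs, fire at a threshold) fits in a few lines; with threshold 2 it classifies two binary inputs like an AND gate (the threshold value is our illustrative choice):

```python
# McCulloch-Pitts neuron: sums its inputs and fires (1) if the sum
# reaches the threshold, otherwise stays silent (0).
def mcp_neuron(inputs, threshold):
    return 1 if sum(inputs) >= threshold else 0

# With threshold 2, the neuron fires only when both inputs are active (AND).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, mcp_neuron([a, b], threshold=2))
```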
66
Q

How is Hebb's rule summarized?

A

neurons that fire together wire together

67
Q

What does Hebb's rule describe?

A

how neuronal activities influence the connections between neurons

68
Q

How many layers is Rosenblatt's perceptron made up of?

A

a single layer

69
Q

When was the Mark I Perceptron produced and what was it?

A

1957; a neural network hardware device

70
Q

What is a linear system?

A

a system that solves problems where the input classes are linearly separable

71
Q

What was the first implementation of the perceptron algorithm?

A

the Mark I Perceptron

72
Q

What was the input for the Mark I Perceptron device?

A

images of characters, captured by a 20x20 array of photocells

73
Q

What did the Mark I Perceptron do with the input?

A

classified the characters into character classes (A, B, C, etc.), i.e., character recognition

74
Q

Where was/is the Mark I Perceptron housed?

A

the Smithsonian Museum, USA

75
Q

What is the structure of the single-layer perceptron?

A

adjusts the connection strengths of just one layer; uses McCulloch-Pitts neurons and Hebb's learning rule to learn from error feedback

76
Q

What is the structure of the input/output in neurons?

A

the n inputs are multiplied by an n-element connection-strength (weight) vector and summed

77
Q

What determines the neuron's output from this sum?

A

the activation function

78
Q

What is the output of the neuron?

A

1 if the value is greater than the threshold (usually 0), otherwise -1

79
Q

What are some typical non-linear activation functions used in NNs?

A

the step function, the threshold logic function, and the S-shaped sigmoid function

80
Q

What is the activation function commonly used in perceptrons?

A

the sigmoid function

81
Q

What is a key characteristic of the sigmoid function?

A

it has a smooth value between 0 and 1

82
Q

Describe the perceptron learning process.

A

1) initialize connection strengths and thresholds, 2) present a new input and the expected output, 3) calculate the actual output value, 4) readjust the connection strengths, 5) go to step 2 and repeat until no more adjustments are needed

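The five-step learning process above can be sketched as a training loop for the (linearly separable) AND function, using the 1 / -1 output rule from card 78 (the data set and learning rate are illustrative choices):

```python
# Perceptron learning for AND. Targets are +1 / -1 per the threshold rule.
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1   # step 1: initialize connection strengths

def predict(x):
    s = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1 if s > 0 else -1

changed = True
while changed:                     # step 5: repeat until no more adjustments
    changed = False
    for x, target in data:         # step 2: present input and expected output
        out = predict(x)           # step 3: calculate actual output value
        if out != target:          # step 4: readjust connection strengths
            for i in range(len(w)):
                w[i] += lr * (target - out) * x[i]
            b += lr * (target - out)
            changed = True

print([predict(x) for x, _ in data])  # -> [-1, -1, -1, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates.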
83
Q

Generally, what is linear separability?

A

a property of 2 sets of points in Euclidean geometry

84
Q

What is the definition of linearly separable points?

A

data points in binary classification problems that can be separated using a linear decision boundary (basically, points that can be divided into two areas by a straight line)

85
Q

What binary functions are linearly separable?

A

AND and OR

86
Q

What binary function is not linearly separable?

A

XOR (the exclusive-or function)

87
Q

True or false: A perceptron can only converge on linearly separable data?

A

true

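The AND-vs-XOR distinction above can be checked by brute force: a small grid search over weights and biases finds a separating line for AND but none for XOR (an illustrative check, not a formal proof):

```python
# Does any single threshold unit (w1*x1 + w2*x2 + b > 0) reproduce the table?
def fits(table, w1, w2, b):
    return all((1 if w1 * x1 + w2 * x2 + b > 0 else 0) == y
               for (x1, x2), y in table)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

grid = [i / 2 for i in range(-4, 5)]  # -2.0 .. 2.0 in steps of 0.5
def solvable(table):
    return any(fits(table, w1, w2, b)
               for w1 in grid for w2 in grid for b in grid)

print(solvable(AND))  # True  (e.g. w1 = w2 = 1, b = -1.5)
print(solvable(XOR))  # False (no linear boundary exists)
```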
88
Q

What is the limitation of the single-layer perceptron?

A

only linearly separable sets can be separated (so not XOR sets)

89
Q

What solved the linear separability problem with the single-layer perceptron, and when?

A

in the mid-1980s, the multi-layer perceptron solved the XOR problem

90
Q

The single-layer perceptron has how many nodes between the input matrix and the decision node?

A

one node

91
Q

Is the single-layer perceptron suitable as a learning model?

A

no

92
Q

What is the single-layer perceptron used widely for?

A

character recognition

93
Q

What are 2 early neural network models?

A

Adaline (Adaptive Linear Neuron) and Madaline (Many Adaline)

94
Q

Who proposed the first couple of early neural networks in 1960?

A

Bernard Widrow and Ted Hoff

95
Q

What are some applications of Adaline?

A

system modeling, statistical prediction, eliminating communication noise and echo, channel equalization, and adaptive signal processing

96
Q

For how many years after 1969 did neural network research stagnate?

A

10 years

97
Q

When was the multi-layer perceptron model proposed?

A

the mid-1980s

98
Q

What key NN research group is based out of Stanford?

A

the Parallel Distributed Processing (PDP) group

99
Q

What does the PDP group focus on?

A

the study of cognitive processes using parallel distributed processing models

100
Q

What are some activities of the PDP group?

A

developing models of cognitive processes, creating software programs to simulate these models, and conducting experiments to test them

101
Q

What prominent book was published by the PDP group in 1986?

A

Parallel Distributed Processing: Explorations in the Microstructure of Cognition

102
Q

What important algorithm did the PDP group introduce?

A

backpropagation

103
Q

When was the backpropagation algorithm proposed?

A

in 1986, in the multi-layer perceptron structure

104
Q

What was the multi-layer perceptron with backpropagation useful for?

A

overcoming the limitations of single-layer perceptrons, particularly the linear separability (XOR) problem

105
Q

How does backpropagation implement learning?

A

by propagating errors backward through the network, in the opposite direction of the forward pass

106
Q

What is backpropagation used for?

A

supervised learning of artificial neural networks using gradient descent

107
Q

What exactly is backpropagation?

A

a backward-propagation-of-errors algorithm that calculates the gradient of the error function with respect to the NN's weights

108
Q

In backpropagation, what minimizes errors?

A

a backward pass that adjusts the neural network model's parameters

109
Q

What are the steps of backpropagation in a multi-layered network?

A

propagate training data through the model, adjust the model weights to reduce the error, and repeatedly update the weights until convergence or the iteration limit is reached

110
Q

Who was both the first author of the famous backpropagation paper in 1986 and a key member of the PDP group?

A

Dr. David Rumelhart

111
Q

What was an early goal of the PDP group?

A

to recognize incomplete or noisy characters

112
Q

What is the structure of the multi-layer perceptron?

A

one or more hidden layers between the input and output layers

113
Q

What is the order of connections for multi-layer perceptrons?

A

input -> hidden -> output

114
Q

Are there connections within each layer of a multi-layer perceptron?

A

no

115
Q

True or false: There is a direct connection from the output layer to the input layer of a multi-layer perceptron?

A

false

116
Q

Fill in the blank: Multi-layer perceptron makes the input and output characteristics of nodes __________.

A

nonlinear

117
Q

What are the steps in the forward pass of the backpropagation learning algorithm of the multi-layer perceptron?

A

give an input pattern to each node in the input layer; the signal is converted at each node and sent to the hidden layer; output signals pass from the hidden layer to the output layer

118
Q

After the forward pass, what are the remaining steps of the backpropagation learning algorithm of the multi-layer perceptron?

A

compare the output to the expected output, adjust connection strengths, backpropagate again to adjust connection strengths, and repeat until the end conditions are met

119
Q

What are the end conditions for the backpropagation learning algorithm of the multi-layer perceptron?

A

when the actual output value and the target output value are within the error range

120
Q

What is the delta rule?

A

a weight-update rule derived from the squared error between the output and the target output value

121
Q

What is the gradient descent method?

A

an iterative optimization algorithm used to find a local minimum of a differentiable function (here, the squared-error surface)

122
Q

What is the idea behind the gradient descent method?

A

to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent (i.e., to find the minimum by moving along the negative gradient)

123
Q

How does backpropagation use gradient descent in learning?

A

it updates the weights of the network during training by computing the gradient of the loss function with respect to the weights, then moving the weights in the opposite direction of the gradient until the weights converge to a minimum

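The gradient descent idea above, on a simple one-dimensional error surface f(w) = (w - 3)^2 (the surface and learning rate are made-up for illustration):

```python
# Gradient descent on f(w) = (w - 3)^2, whose gradient is f'(w) = 2 * (w - 3).
# Repeatedly stepping against the gradient walks w down to the minimum at 3.

w, lr = 0.0, 0.1
for _ in range(200):
    grad = 2 * (w - 3)  # gradient of the error surface at the current point
    w -= lr * grad      # move in the direction of steepest descent

print(w)  # converges to ~3.0, the minimum of the surface
```

In backpropagation the same update is applied to every weight, with the gradient supplied by the chain rule instead of a hand-derived formula.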
124
Q

What are the disadvantages of the backpropagation learning algorithm?

A

long training time, and the possibility of getting stuck in a local minimum (steepest descent is likely to settle at a local minimum when we want the global minimum)

125
Q

What did the multi-layer perceptron make possible?

A

a neural network that can implement the XOR function

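A hand-wired example of the XOR capability above: a 2-2-1 multi-layer perceptron with step units, where the hidden layer computes OR and NAND and the output unit ANDs them (the weights are chosen by hand for illustration, not learned):

```python
def step(z):
    return 1 if z > 0 else 0

def mlp_xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit 1: OR
    h2 = step(-x1 - x2 + 1.5)   # hidden unit 2: NAND
    return step(h1 + h2 - 1.5)  # output unit: AND of the hidden outputs

for a in (0, 1):
    for b in (0, 1):
        print(a, b, mlp_xor(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

The hidden layer re-maps the inputs into a space where the classes become linearly separable, which is exactly what the single-layer perceptron could not do.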
126
Q

What are other applications of multi-layer perceptron models?

A

the parity problem, the encoding problem, the symmetry problem, the NETtalk text-to-speech system, stock market prediction, translation between different languages, factory automation, robots, and real-time voice recognition