Neural Networks Flashcards
An AI Neural Network Is an ___ paradigm
It is inspired by the way biological nervous systems, such as the brain, ___ information
information processing
process
AI NN learn by ___, like people
Learning in biological systems involves adjustments to the synaptic ___ that exist between the ___
example
connections
neurons
NN derive ___ from ___ and ___ data
meaning
complicated
imprecise
NN characteristics 1- ___ 2- ___ 3- ___ 4- ___
A. Adaptive Learning
S. Self-Organization
R. Real Time Operation
F. Fault Tolerance
A Sopa RreFeceu
A Perceptron is a simple model that consists of a single trainable ___
It receives several ___ and their ___, and has a ___ T (a real value)
neuron
inputs
weights
Threshold
To train a Perceptron we give it ___ and the ___, then we give it ___ and tell it whether it got them right or wrong
inputs
desired outputs
examples
What if the Perceptron has the wrong output?
If the Desired Output is 0, we should decrease the weights
If the Desired Output is 1, we should increase the weights
The change in the weight of an edge should be ___ to the input through that edge
Meaning that if an input is really high then it is accountable for ___ of the error in the output
directly proportional
most
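The update rule on these cards can be sketched in code. A minimal sketch, assuming a fixed threshold T, a learning rate, and the AND problem as training data (all three are my own illustrative choices):

```python
def train_perceptron(examples, n_inputs, lr=0.1, epochs=20):
    """Perceptron learning rule: adjust each weight in proportion to its input."""
    weights = [0.0] * n_inputs
    threshold = 0.5  # fixed threshold T, assumed for this sketch
    for _ in range(epochs):
        for inputs, desired in examples:
            output = 1 if sum(x * w for x, w in zip(inputs, weights)) > threshold else 0
            error = desired - output  # +1 -> increase weights, -1 -> decrease
            # the change is directly proportional to the input through each edge
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
    return weights

# AND is linearly separable, so the perceptron converges on it
and_examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights = train_perceptron(and_examples, n_inputs=2)
```

A high input through an edge produces a large weight change on that edge, which matches the "directly proportional" card above.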
Can a Perceptron solve problems that are not linearly separable?
No
What algorithm do we use to train a MLP?
Backpropagation Algorithm
In the Backpropagation Algorithm we follow these steps:
1- ___
2- For each training example do a ___
3- Obtain ___ by comparing the result with the ___
4- Do a ___
5- if loss ___ ℇ, or if loss is still ___ at a reasonable rate, go to 2
1- Initialization 2- forward propagation 3- loss (error) / desired output 4- backward propagation 5- >= /decreasing
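The five steps above can be sketched as a training loop. A minimal sketch using a tiny one-weight model and squared loss (the model, data, and stopping tolerance are my own illustrative choices):

```python
import random

def train(examples, lr=0.1, eps=1e-4, max_epochs=1000):
    w = random.uniform(-0.5, 0.5)          # 1- initialization
    for _ in range(max_epochs):
        loss = 0.0
        for x, desired in examples:
            y = w * x                      # 2- forward propagation
            error = y - desired            # 3- loss by comparing with the desired output
            loss += error ** 2
            w -= lr * 2 * error * x        # 4- backward propagation (gradient step)
        if loss < eps:                     # 5- stop once the loss is small enough
            break
    return w

# Learn y = 2x from a few examples; w converges to roughly 2
w = train([(1, 2), (2, 4), (3, 6)])
```

Each pass over the whole list of examples is one epoch, which connects to the Epoch card further down.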
True or false
The gradient descent method involves calculating the derivative of the loss (error) function with respect to the weights of the network
True
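This fact can be checked concretely: the analytic derivative of a squared-error loss with respect to a weight should agree with a numerical finite difference. A minimal sketch with a one-weight model of my own choosing:

```python
def loss(w, x=3.0, desired=6.0):
    """Squared-error loss of a one-weight model y = w * x."""
    return (w * x - desired) ** 2

def analytic_grad(w, x=3.0, desired=6.0):
    """d(loss)/dw = 2 * (w*x - desired) * x -- the derivative gradient descent uses."""
    return 2 * (w * x - desired) * x

w, h = 1.5, 1e-6
numeric = (loss(w + h) - loss(w - h)) / (2 * h)
# numeric and analytic_grad(w) agree closely
```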
Can we solve any problem with a single hidden layer?
Yes — a single hidden layer with enough neurons can approximate any continuous function (universal approximation theorem), though it may need impractically many neurons
In Hidden Layers:
1- Too few neurons can lead to ___ as there are not enough to capture the problem's ___ details
2- Too many neurons can lead to ___ as the information in the training set is not enough to ___ all neurons in the hidden layers
Also there is an ___ increase in training time
1- Underfitting / intricate
2- Overfitting / train / Exponential
The purpose of the Activation Function is to introduce ___ to artificial NN
nonlinear real-world properties
What are the most common Activation Functions on MLPs?
S. Sigmoid
T. Tanh
R. ReLU
SToR
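The three functions can be written directly. A minimal sketch using only the standard library:

```python
import math

def sigmoid(x):
    """Squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Squashes any real input into (-1, 1), centered at 0."""
    return math.tanh(x)

def relu(x):
    """Passes positive inputs through unchanged, zeroes out negative ones."""
    return max(0.0, x)

# sigmoid(0) == 0.5, tanh(0) == 0.0, relu(-3) == 0.0
```

All three are nonlinear, which is exactly the property the purpose card above asks for.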
1- Large learning rates result in ___ training and ___ results
2- Tiny learning rates ___ the training process and might result in a ___ to train
1- unstable / non-optimal
2- lengthen / failure
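Both failure modes can be seen on a one-dimensional example: gradient descent minimizing f(w) = w², whose gradient is 2w (the function and the three rates are my own illustration):

```python
def descend(lr, steps=50, w=1.0):
    """Gradient descent on f(w) = w**2, whose gradient is 2*w."""
    for _ in range(steps):
        w -= lr * 2 * w
    return abs(w)

# A moderate rate converges; a huge rate diverges; a tiny rate barely moves
print(descend(0.1))    # close to 0
print(descend(1.1))    # blows up (unstable training)
print(descend(1e-4))   # still far from 0 (lengthened training)
```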
What Hyperparameters exist on MLPs?
Inputs Outputs Hidden Layers Activation Function Learning Rate
What are the typical values for the Learning Rate?
0.01 to 0.1
What are Epochs?
One Epoch is one complete pass over the whole training set
NN extract ___ and detect ___ that are too ___ for humans or other techniques
patterns
trends
complex
NN can perform tasks that are ___ to humans but ___ for other techniques (like ___)
trivial
difficult
handwriting recognition
In a Perceptron, if the ___ of the ___ multiplied by the respective ___ is greater than ___, then the Output is ___, and ___ otherwise
sum / inputs / weights / T / 1 / 0
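The threshold rule on this card translates directly into code. A minimal sketch (the weight and threshold values are my own, chosen so the example behaves like a logical AND):

```python
def perceptron_output(inputs, weights, threshold):
    """Output 1 if the weighted sum of the inputs exceeds the threshold T, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum > threshold else 0

# Example: two inputs acting as a logical AND gate
print(perceptron_output([1, 1], [0.6, 0.6], 1.0))  # 1.2 > 1.0 -> 1
print(perceptron_output([1, 0], [0.6, 0.6], 1.0))  # 0.6 <= 1.0 -> 0
```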