ML-04 - Neural network Flashcards
ML-04-Neural network
Why use ANN over polynomial regression?
For a large number of features, polynomial regression gets too big.
Ex:
With 1,000 raw features, including all quadratic terms x_i x_j (not just the squares x_i^2) gives on the order of n^2 terms, roughly 500,000 distinct input features. It scales even worse for cubic and higher-order terms.
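A minimal counting sketch in Python (assumes Python 3.8+ for math.comb; the 1,000-feature figure is the card's example, the helper name is mine):

from math import comb

def num_degree_d_terms(n: int, d: int) -> int:
    # Distinct monomials of degree exactly d over n features:
    # combinations with repetition, C(n + d - 1, d).
    return comb(n + d - 1, d)

n = 1_000
print(num_degree_d_terms(n, 2))  # 500500    ~ n^2 / 2 quadratic terms
print(num_degree_d_terms(n, 3))  # 167167000 ~ n^3 / 6 cubic terms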
ML-04-Neural network
Where are the dendrites located? (See image)
(See image)
ML-04-Neural network
Where is the nucleus located? (See image)
(See image)
ML-04-Neural network
Where is the axon located? (See image)
(See image)
ML-04-Neural network
Where are the input wires located? (See image)
(See image)
ML-04-Neural network
Where is the cell body located? (See image)
(See image)
ML-04-Neural network
Where is the output wire located? (See image)
(See image)
ML-04-Neural network
Where is the node of Ranvier located? (See image)
(See image)
ML-04-Neural network
Where is the axon terminal located? (See image)
(See image)
ML-04-Neural network
Where is the myelin sheath located? (See image)
(See image)
ML-04-Neural network
Where is the Schwann cell located? (See image)
(See image)
ML-04-Neural network
What is the difference between a NN and the perceptron?
- A perceptron uses a step (threshold) activation function.
- Perceptron outputs are binary, i.e. in {0, 1}.
- A NN can use other activation functions (e.g. sigmoid, tanh).
- NN outputs are real-valued, often in [0, 1] or [-1, 1] (see the sketch below).
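A minimal sketch of the contrast (Python; the weights, bias, and input are hypothetical, chosen only for illustration):

import math

def step(z: float) -> int:
    # Perceptron: hard threshold, output is binary, in {0, 1}
    return 1 if z >= 0 else 0

def sigmoid(z: float) -> float:
    # Typical NN unit: smooth activation, output is a real value in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

w, b = [0.8, -0.4], 0.1                       # hypothetical weights and bias
x = [1.0, 2.0]                                # hypothetical input
z = sum(wi * xi for wi, xi in zip(w, x)) + b  # z = 0.1

print(step(z))     # 1      -> hard binary decision
print(sigmoid(z))  # ~0.525 -> graded, differentiable output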
ML-04-Neural network
What notation would you use to denote which layer a weight belongs to?
w_3^(1) = first layer, 3rd neuron
This connects input values (layer 1) with the 2nd layer.
(See image)
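A minimal indexing sketch (Python with NumPy assumed; the 3-input, 4-neuron layer sizes and variable names are hypothetical):

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # W^(1): connects layer 1 (3 inputs) to the 4 neurons of layer 2

w_3_1 = W1[2]       # w_3^(1): the weights of the 3rd neuron in that mapping (0-based row 2)
print(w_3_1.shape)  # (3,) -- one weight per input value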
ML-04-Neural network
What notation would you use to denote the 3rd neuron’s activation in the 2nd layer?
a_3^(2) = 2nd layer, 3rd neuron
(See image)
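A minimal sketch of how that activation would be computed (Python with NumPy; all numbers are hypothetical, and the sigmoid is just one possible activation function):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 0.5, -2.0])      # layer-1 values (the inputs), hypothetical
w_3_1 = np.array([0.2, -0.7, 0.1])  # w_3^(1): hypothetical weights of the 3rd neuron
b_3 = 0.05                          # hypothetical bias term

a_3_2 = sigmoid(w_3_1 @ x + b_3)    # a_3^(2): activation of the 3rd neuron in layer 2
print(a_3_2)                        # ~0.426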
ML-04-Neural network
Describe what the numbers mean in this picture (See image)
- Red is the layer
- Blue is which neuron in the layer it is (or, for a weight, which neuron the weight belongs to).
- Green is which specific weight it is (see the indexing sketch below).
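Assuming the picture follows the common w_{j,i}^{(l)} convention (superscript = layer of weights, first subscript = which neuron, second subscript = which incoming weight), a minimal indexing sketch (Python with NumPy; layer sizes and numbers are hypothetical):

import numpy as np

rng = np.random.default_rng(1)
W = {1: rng.standard_normal((4, 3)),   # W^(1): layer 1 (3 inputs) -> layer 2 (4 neurons)
     2: rng.standard_normal((2, 4))}   # W^(2): layer 2 -> layer 3 (2 outputs)

layer, neuron, weight = 1, 3, 2          # red, blue, green in the card's picture
print(W[layer][neuron - 1, weight - 1])  # the single weight w_{3,2}^{(1)}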