Topic 4: Neural Networks Flashcards
What are the main components of an artificial neuron?
Synaptic weights, threshold, synaptic potential, activation function, and state.
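As a rough illustration (assuming NumPy is available; the numbers are made up), each component maps onto one line of code:

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])             # inputs
w = np.array([0.8, 0.2, -0.5])             # synaptic weights
theta = 0.1                                # threshold

potential = np.dot(w, x) - theta           # synaptic potential: weighted sum minus threshold
state = 1.0 / (1.0 + np.exp(-potential))   # activation function (sigmoid) gives the neuron's state
print(potential, state)
```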
What is the primary limitation of single-layer perceptrons?
They can only classify linearly separable datasets.
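A minimal sketch, assuming NumPy, of the classic perceptron learning rule: it learns AND (linearly separable) but never gets XOR (not linearly separable) right:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    # Classic perceptron rule: w += lr * (target - prediction) * x
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return [1 if np.dot(w, xi) + b > 0 else 0 for xi in X]

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(train_perceptron(X, np.array([0, 0, 0, 1])))  # AND: separable, learned correctly
print(train_perceptron(X, np.array([0, 1, 1, 0])))  # XOR: not separable, cannot be learned
```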
What improvement does the backpropagation rule bring to multilayer perceptrons?
It propagates the output error backwards through the hidden layers, so the network can adjust weights in every layer, not just the output layer, to minimize the error.
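A minimal sketch of this idea, assuming NumPy and a hypothetical 2-4-1 sigmoid network with a squared-error loss; the architecture and hyperparameters are illustrative, not prescribed here:

```python
import numpy as np

rng = np.random.default_rng(0)
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 2-4-1 network trained on XOR, so weights in *both* layers must change.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.5

for _ in range(10000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: the output error is propagated back to the hidden layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates of the weights in *both* layers.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.ravel().round(2))  # typically approaches [0, 1, 1, 0]; depends on the random init
```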
What is a perceptron in the context of neural networks?
A perceptron is the simplest type of artificial neuron that computes a weighted sum of its inputs and applies an activation function to produce an output.
How does a neural network learn from data?
Neural networks learn by adjusting weights and biases through backpropagation and gradient descent, minimizing a loss function.
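A toy sketch of that loop for a single weight, assuming NumPy and an invented one-parameter model y = w * x with a mean-squared-error loss:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                      # data generated with the "true" weight 2.0
w, lr = 0.0, 0.05

for step in range(100):
    pred = w * x
    loss = np.mean((pred - y) ** 2)       # loss function to minimize
    grad = np.mean(2 * (pred - y) * x)    # dLoss/dw
    w -= lr * grad                        # gradient-descent weight update
print(w)  # close to 2.0
```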
What is the difference between shallow and deep neural networks?
Shallow networks: Have one or a few hidden layers.
Deep networks: Have many hidden layers, enabling them to learn hierarchical features.
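One way to picture the difference, assuming NumPy and made-up layer sizes, is to compare how many weight matrices each network stacks:

```python
import numpy as np

def build_layers(sizes, rng=np.random.default_rng(0)):
    # One weight matrix (and bias vector) per consecutive pair of layer sizes.
    return [(rng.normal(size=(n_in, n_out)), np.zeros(n_out))
            for n_in, n_out in zip(sizes[:-1], sizes[1:])]

shallow = build_layers([784, 64, 10])                # one hidden layer
deep    = build_layers([784, 256, 128, 64, 32, 10])  # several hidden layers, hierarchical features
print(len(shallow) - 1, "hidden layer(s) vs", len(deep) - 1)
```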
What is overfitting in neural networks, and how can it be mitigated?
Overfitting occurs when the network performs well on training data but poorly on unseen data. It can be mitigated using techniques like regularization, dropout, and early stopping.
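A sketch of early stopping with invented validation-loss values (regularization and dropout are covered on other cards):

```python
# Stop training once the validation loss has not improved for `patience` epochs.
val_losses = [0.90, 0.70, 0.55, 0.48, 0.45, 0.44, 0.46, 0.47, 0.49, 0.52]  # illustrative only

patience, best, since_best = 2, float("inf"), 0
for epoch, loss in enumerate(val_losses):
    if loss < best:
        best, since_best = loss, 0   # validation loss improved: keep training
    else:
        since_best += 1              # no improvement this epoch
    if since_best >= patience:
        print(f"stop at epoch {epoch}, best validation loss {best}")
        break
```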
What is dropout in neural networks?
Dropout is a regularization technique in which randomly selected neurons are temporarily ignored (dropped) during training to prevent overfitting.
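A sketch of the common "inverted dropout" formulation, assuming NumPy; the drop probability and batch shape are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))    # hidden-layer activations for a batch of 4
p_drop = 0.5

# Zero out random units during training and rescale the survivors
# so the expected activation stays the same.
mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
h_train = h * mask             # used during training
h_eval = h                     # at test time all units are kept, no mask
```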
What is the purpose of a loss function in a neural network?
The loss function quantifies the difference between the predicted and actual outputs, guiding the optimization process.
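Two widely used examples, assuming NumPy and invented predictions: mean squared error and binary cross-entropy:

```python
import numpy as np

y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7, 0.4])

# Mean squared error, common for regression.
mse = np.mean((y_pred - y_true) ** 2)

# Binary cross-entropy, common for two-class classification (eps avoids log(0)).
eps = 1e-12
bce = -np.mean(y_true * np.log(y_pred + eps) + (1 - y_true) * np.log(1 - y_pred + eps))

print(mse, bce)  # lower values mean predictions are closer to the targets
```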
What is the difference between training, validation, and test sets?
Training set: Used to train the model.
Validation set: Used to tune hyperparameters and prevent overfitting.
Test set: Used to evaluate the model’s final performance.
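A sketch of such a split, assuming NumPy and a made-up 70/15/15 ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.integers(0, 2, size=100)  # toy dataset

idx = rng.permutation(len(X))                        # shuffle before splitting
train, val, test = idx[:70], idx[70:85], idx[85:]    # 70% / 15% / 15%

X_train, y_train = X[train], y[train]  # fit the weights here
X_val, y_val = X[val], y[val]          # tune hyperparameters / early stopping here
X_test, y_test = X[test], y[test]      # report final performance here, once
```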