Neural Network Flashcards
What is a Neural Network?
A computational model inspired by the human brain that consists of interconnected layers of nodes (neurons).
What is Deep Learning?
A subset of machine learning that uses deep neural networks with multiple layers for complex pattern recognition.
Why has Deep Learning become popular now?
Better algorithms, increased computing power (GPUs, TPUs), and availability of large labeled datasets.
What are the main components of a Neural Network?
Input layer, hidden layers, output layer, weights, biases, and activation functions.
What is the role of the Input Layer?
Receives raw data for processing.
What are Hidden Layers?
Intermediate layers between input and output that perform feature extraction and transformations.
What is the role of the Output Layer?
Produces final predictions or classifications.
What is a weighted sum in a neural unit?
A combination of inputs multiplied by their respective weights plus a bias term.
What is the formula for a weighted sum in a neural unit?
z = b + Σ(wi · xi), where z is the weighted sum, wi are the weights, xi are the inputs, and b is the bias.
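A minimal Python sketch of this weighted sum; the weight, input, and bias values below are made up purely for illustration:

    weights = [0.5, -0.5, 0.25]   # w_i (illustrative values)
    inputs  = [2.0, 1.0, 4.0]     # x_i
    bias    = 0.5                 # b
    # Weighted sum: multiply each input by its weight, add them up, add the bias
    z = bias + sum(w * x for w, x in zip(weights, inputs))
    print(z)  # 0.5*2.0 + (-0.5)*1.0 + 0.25*4.0 + 0.5 = 2.0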
What is an activation function?
A mathematical function that introduces non-linearity into the neural network, allowing it to learn complex patterns.
What is the Sigmoid activation function?
A function that maps input values to the range (0, 1), useful for binary classification.
What is the formula for the Sigmoid function?
y = 1 / (1 + e^(-z)).
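A short Python sketch of the Sigmoid formula above, using only the standard library; the example inputs are arbitrary:

    import math

    def sigmoid(z):
        # Squashes any real z into the open interval (0, 1)
        return 1.0 / (1.0 + math.exp(-z))

    print(sigmoid(0.0))  # 0.5
    print(sigmoid(4.0))  # ~0.982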
What is the Tanh activation function?
A function similar to Sigmoid but with outputs in the range (-1, 1); its zero-centered output often gives better gradient flow.
What is the ReLU activation function?
A function that outputs x if x > 0, otherwise outputs 0, helping mitigate the vanishing gradient problem.
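A one-line Python sketch of ReLU, with illustrative inputs:

    def relu(z):
        # Passes positive values through unchanged, clamps negatives to 0
        return z if z > 0 else 0.0

    print(relu(3.2))   # 3.2
    print(relu(-1.5))  # 0.0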
What is a Feedforward Neural Network?
A neural network where connections move in one direction from input to output without cycles.
What is a Multi-Layer Perceptron (MLP)?
A type of feedforward neural network with one or more hidden layers.
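A minimal Python sketch of an MLP forward pass with one hidden layer; the layer sizes, weight values, and the choice of sigmoid activation are illustrative assumptions, not a fixed recipe:

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def layer(inputs, weights, biases):
        # One fully connected layer: weighted sum + bias for each unit, then sigmoid
        return [sigmoid(b + sum(w * x for w, x in zip(row, inputs)))
                for row, b in zip(weights, biases)]

    x = [0.5, -1.0]                                            # 2 input features
    hidden = layer(x, [[0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1])   # 2 hidden units
    output = layer(hidden, [[0.7, -0.2]], [0.05])              # 1 output unit
    print(output)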
What is the Loss Function?
A function that quantifies how well the neural network’s predictions match the actual values.
What is Gradient Descent?
An optimization algorithm used to adjust weights by minimizing the loss function.
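A toy Python sketch of gradient descent on a single parameter; the loss (w - 3)^2, the learning rate, and the starting point are made up for illustration:

    # Minimize the toy loss L(w) = (w - 3)^2 with gradient descent.
    # Its gradient is dL/dw = 2 * (w - 3).
    w = 0.0
    learning_rate = 0.1
    for step in range(50):
        grad = 2 * (w - 3)
        w -= learning_rate * grad   # step against the gradient
    print(w)  # approaches 3, the minimizer of the loss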
What is Backpropagation?
A process where gradients are calculated using the chain rule and propagated backward to update weights.
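A minimal Python sketch of backpropagation for a single sigmoid neuron with squared-error loss; all values (input, target, learning rate) are illustrative:

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    x, y_true = 1.5, 1.0   # one training example
    w, b = 0.2, 0.0        # parameters to learn
    lr = 0.5

    for step in range(100):
        # Forward pass
        z = w * x + b
        y_hat = sigmoid(z)
        # Backward pass: chain rule, dLoss/dw = dLoss/dy_hat * dy_hat/dz * dz/dw
        dloss_dyhat = 2 * (y_hat - y_true)
        dyhat_dz = y_hat * (1 - y_hat)   # derivative of sigmoid
        dz_dw, dz_db = x, 1.0
        w -= lr * dloss_dyhat * dyhat_dz * dz_dw
        b -= lr * dloss_dyhat * dyhat_dz * dz_db

    print(sigmoid(w * x + b))  # close to the target 1.0 after training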
What is the Vanishing Gradient Problem?
A situation where small gradients cause slow or no learning in deep networks, especially with Sigmoid activation.
What is a Convolutional Neural Network (CNN)?
A neural network specialized for processing image data by using convolutional layers.
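A minimal Python sketch of a single convolution: a hypothetical 3x3 vertical-edge kernel slides over a tiny 4x4 "image" (no padding or stride handling, values chosen for illustration):

    image = [[1, 1, 0, 0],
             [1, 1, 0, 0],
             [1, 1, 0, 0],
             [1, 1, 0, 0]]
    kernel = [[1, 0, -1],
              [1, 0, -1],
              [1, 0, -1]]

    feature_map = []
    for i in range(len(image) - 2):
        row = []
        for j in range(len(image[0]) - 2):
            # Element-wise multiply the 3x3 patch by the kernel and sum
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(3) for dj in range(3)))
        feature_map.append(row)
    print(feature_map)  # strong responses where the vertical edge is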
What is a Recurrent Neural Network (RNN)?
A neural network designed for sequence-based data, where past information is retained in hidden states.
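A minimal Python sketch of one recurrent step repeated over a sequence, where the hidden state carries past information forward; the weights and the input sequence are illustrative:

    import math

    w_x, w_h, b = 0.5, 0.8, 0.0   # input weight, recurrent weight, bias
    h = 0.0                       # initial hidden state
    for x_t in [1.0, 0.0, -1.0]:  # input sequence
        h = math.tanh(w_x * x_t + w_h * h + b)
        print(h)                  # each step depends on the previous hidden state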
What is Long Short-Term Memory (LSTM)?
A type of RNN designed to overcome the vanishing gradient problem and remember long-term dependencies.
What are common applications of Neural Networks?
Speech-to-text, image recognition, natural language processing, and generative models.
What is Sentiment Analysis using Neural Networks?
A classification task where a neural network predicts whether a text expresses positive or negative sentiment.
What is the role of embeddings in neural networks?
They represent words or features as vectors in a dense vector space, improving learning in NLP tasks.
What is the Softmax function?
A function that converts logits into probabilities for multi-class classification tasks.
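A short Python sketch of Softmax over a list of logits, with the usual max-subtraction trick for numerical stability; the logit values are arbitrary:

    import math

    def softmax(logits):
        # Subtract the max for numerical stability, then normalize the exponentials
        m = max(logits)
        exps = [math.exp(z - m) for z in logits]
        total = sum(exps)
        return [e / total for e in exps]

    print(softmax([2.0, 1.0, 0.1]))  # probabilities summing to 1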
What is Cross-Entropy Loss?
A loss function used in classification tasks to measure the difference between predicted probabilities and true labels.
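A short Python sketch of cross-entropy loss for one example, assuming the predicted probabilities come from a softmax and the label is given as a class index:

    import math

    def cross_entropy(predicted_probs, true_index):
        # Negative log of the probability assigned to the correct class
        return -math.log(predicted_probs[true_index])

    probs = [0.7, 0.2, 0.1]          # e.g. a softmax output
    print(cross_entropy(probs, 0))   # small loss: the right class got 0.7
    print(cross_entropy(probs, 2))   # large loss: the right class got only 0.1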
What is Overfitting in Neural Networks?
When a model performs well on training data but poorly on unseen data.
How can Overfitting be prevented?
Using regularization techniques such as dropout, L2 regularization, and data augmentation.
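A minimal Python sketch of (inverted) dropout, one of the regularization techniques listed above; the drop rate and activation values are illustrative choices:

    import random

    def dropout(activations, rate=0.5):
        # During training, randomly zero each activation with probability `rate`
        # and scale the survivors so the expected sum stays the same
        return [0.0 if random.random() < rate else a / (1.0 - rate)
                for a in activations]

    print(dropout([0.3, 0.9, 0.5, 0.2], rate=0.5))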