ml-flashcards

1
Q

What is the relation between deep learning and AI?

A

Deep learning is a subset of machine learning, which is itself a subset of AI.

Key points:
1. AI is the broader field of making machines intelligent
2. Deep learning specifically uses neural networks with multiple layers
3. Deep learning learns features automatically from raw data, unlike traditional approaches that rely on hand-engineered features
4. It excels at pattern recognition and complex tasks like vision and language
5. Deep learning has driven many recent AI breakthroughs

2
Q

How do you define machine learning?

A

Machine learning is the field of computer science where systems learn from data without being explicitly programmed.

Key aspects:
1. Systems improve with experience
2. Algorithms find patterns in training data
3. Models make predictions on new, unseen data
4. Learning can be supervised, unsupervised, or reinforcement-based
5. Focus on generalization from examples rather than following fixed rules

3
Q

What is overfitting and how can you avoid it?

A

Overfitting occurs when a model learns training data too precisely, including noise, leading to poor generalization.

Prevention methods:
1. Cross-validation to detect overfitting
2. Regularization techniques (L1/L2)
3. Dropout in neural networks
4. Early stopping
5. Increasing training data
6. Reducing model complexity
7. Data augmentation
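
As a rough illustration of point 2 above (L2 regularization), here is a minimal NumPy-only sketch, not part of the original card: fitting high-degree polynomial features with and without a ridge penalty. The data, polynomial degree, and penalty strength are assumed values chosen for illustration.

import numpy as np

# Toy data: noisy samples of a smooth function (assumed for illustration).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.shape)

# High-degree polynomial features make it easy to fit the noise.
X = np.vander(x, 10, increasing=True)

def ridge_fit(X, y, lam):
    # Closed-form L2-regularized least squares: w = (X'X + lam*I)^-1 X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_plain = ridge_fit(X, y, lam=0.0)    # no penalty: weights are free to chase the noise
w_ridge = ridge_fit(X, y, lam=1e-3)   # L2 penalty shrinks the weights toward zero

print("unregularized weight norm: ", np.linalg.norm(w_plain))
print("L2-regularized weight norm:", np.linalg.norm(w_ridge))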

4
Q

What is a perceptron and how does it make predictions?

A

A perceptron is the simplest form of feedforward neural network.

Operation:
1. Receives input features multiplied by weights
2. Sums weighted inputs plus bias
3. Applies activation function to sum
4. Outputs binary classification (0 or 1)
5. Can only separate linearly separable classes
6. Serves as the basic model of an artificial neuron
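
A minimal NumPy sketch of that operation; the weights, bias, and inputs below are assumed values, not taken from the card.

import numpy as np

def perceptron_predict(x, w, b):
    # Weighted sum of the inputs plus bias, then a hard threshold:
    # output 1 if w . x + b > 0, otherwise 0.
    return int(np.dot(w, x) + b > 0)

# Illustrative weights and bias for a 2-input perceptron (assumed values).
w = np.array([0.6, 0.4])
b = -0.5
print(perceptron_predict(np.array([1.0, 1.0]), w, b))  # prints 1
print(perceptron_predict(np.array([0.0, 0.0]), w, b))  # prints 0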

5
Q

What is the relation between perceptrons and Boolean functions?

A

Perceptrons can implement basic Boolean functions:
1. AND function: positive weights with a threshold that only both active inputs together exceed
2. OR function: Lower threshold than AND
3. NOT function: Use negative weight
4. Cannot implement XOR (non-linearly separable)
5. Multiple perceptrons can implement any Boolean function
6. Forms basis for more complex neural computations
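
One possible set of hand-picked weights and biases realizing AND, OR, and NOT with a threshold perceptron, as a sketch; many other choices work just as well.

import numpy as np

def perceptron(x, w, b):
    # Fires (outputs 1) when the weighted sum plus bias exceeds zero.
    return int(np.dot(w, x) + b > 0)

# Hand-picked weights and biases (assumed values).
def AND(a, b): return perceptron(np.array([a, b]), np.array([1.0, 1.0]), -1.5)
def OR(a, b):  return perceptron(np.array([a, b]), np.array([1.0, 1.0]), -0.5)
def NOT(a):    return perceptron(np.array([a]),    np.array([-1.0]),      0.5)

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {AND(a, b)}   {a} OR {b} = {OR(a, b)}")
print("NOT 0 =", NOT(0), "  NOT 1 =", NOT(1))
# XOR has no such solution: its positive and negative cases are not linearly separable.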

6
Q

How can you optimize weights in a perceptron?

A

Perceptron weight optimization methods:
1. Perceptron learning rule
2. Gradient descent on error
3. Update weights proportional to input and error
4. Iterate until convergence or maximum iterations
5. Learning rate controls update size
6. Stop when all training examples classified correctly
7. May not converge if data not linearly separable
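
A compact sketch of the perceptron learning rule (points 1 and 3-6 above) on a toy linearly separable dataset; the learning rate, epoch cap, and data are assumed for illustration.

import numpy as np

def train_perceptron(X, y, lr=0.1, max_epochs=100):
    # Perceptron learning rule: w <- w + lr * (target - prediction) * x
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(max_epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = int(np.dot(w, xi) + b > 0)
            update = lr * (target - pred)
            if update != 0.0:
                w += update * xi
                b += update
                errors += 1
        if errors == 0:   # stop once every training example is classified correctly
            break
    return w, b

# Toy linearly separable data (the AND function) as an assumed example.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
print(train_perceptron(X, y))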

7
Q

What role does the activation function play in biological and in artificial neural networks?

A

Activation functions introduce non-linearity and determine neuron firing.

Roles:
1. In biological neurons: Sets the firing threshold that controls signal propagation
2. In artificial neurons: Enables complex pattern learning
3. Common functions: ReLU, sigmoid, tanh
4. Affects gradient flow during training
5. Different functions suit different tasks
6. Enables network to learn non-linear relationships
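
The three common functions named in point 3 can be written in a few lines of NumPy; the sample inputs below are arbitrary.

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:   ", relu(z))
print("sigmoid:", sigmoid(z))
print("tanh:   ", tanh(z))
# sigmoid and tanh saturate for large |z|, which flattens gradients during training,
# while relu keeps a constant gradient of 1 for positive inputs.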

8
Q

What are the limitations of a single perceptron?

A

Single perceptron limitations:
1. Can only learn linearly separable patterns
2. Cannot solve XOR problem
3. Binary output only
4. No hidden layers for feature learning
5. Limited to simple decision boundaries
6. Cannot approximate complex functions
7. No memory of previous inputs
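
A quick demonstration of limitations 1 and 2: training a single perceptron on XOR with the learning rule from card 6 never reaches zero errors. The learning rate and epoch cap are assumed values.

import numpy as np

# XOR targets: no straight line separates the two classes, so however the
# weights are trained, at least one of the four examples stays misclassified.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

w, b = np.zeros(2), 0.0
for _ in range(1000):                 # arbitrary cap; the rule never converges here
    for xi, target in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)
        w += 0.1 * (target - pred) * xi
        b += 0.1 * (target - pred)

preds = [int(np.dot(w, xi) + b > 0) for xi in X]
print("XOR targets:      ", y.tolist())
print("perceptron output:", preds)    # at least one entry disagrees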

9
Q

What is the typical structure of an artificial neural network?

A

Typical ANN structure:
1. Input layer receiving features
2. Hidden layers for processing
3. Output layer for predictions
4. Dense (fully connected) connections between the neurons of adjacent layers
5. Weights on connections
6. Bias terms for each neuron
7. Activation functions at each layer
8. Skip connections in modern architectures
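
A bare-bones NumPy forward pass through such a structure; the layer sizes, random weights, and choice of ReLU are illustrative assumptions.

import numpy as np

# 3 inputs -> 4 hidden units -> 2 outputs; sizes and random weights are assumptions.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)   # input layer -> hidden layer
W2, b2 = rng.standard_normal((2, 4)), np.zeros(2)   # hidden layer -> output layer

def forward(x):
    h = np.maximum(0.0, W1 @ x + b1)   # hidden layer with ReLU activation
    return W2 @ h + b2                 # output layer, e.g. raw class scores

print(forward(np.array([0.5, -1.0, 2.0])))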

10
Q

How does backpropagation generate error signals for the intermediate layers?

A

Backpropagation process:
1. Calculate output layer error
2. Propagate error backward through network
3. Use chain rule of calculus
4. Compute gradient for each weight
5. Consider activation function derivatives
6. Distribute error responsibility to previous layers
7. Update weights based on calculated gradients
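
A minimal worked example of steps 1-7 for a tiny 2-3-1 network with a sigmoid hidden layer and squared-error loss; all sizes, values, and the learning rate are assumed for illustration.

import numpy as np

# One training example through a tiny 2-3-1 network (all values illustrative).
rng = np.random.default_rng(0)
x, y = np.array([0.5, -0.2]), np.array([1.0])
W1, b1 = rng.standard_normal((3, 2)), np.zeros(3)
W2, b2 = rng.standard_normal((1, 3)), np.zeros(1)

# Forward pass.
h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))   # sigmoid hidden layer
y_hat = W2 @ h + b2                        # linear output

# Backward pass for the squared-error loss 0.5 * ||y_hat - y||^2.
delta_out = y_hat - y                              # output-layer error (step 1)
delta_hidden = (W2.T @ delta_out) * h * (1 - h)    # chain rule: error pushed back
                                                   # through W2 and the sigmoid's
                                                   # derivative (steps 2, 3, 5, 6)
grad_W2, grad_b2 = np.outer(delta_out, h), delta_out           # step 4
grad_W1, grad_b1 = np.outer(delta_hidden, x), delta_hidden

# Gradient-descent update (step 7); the learning rate is an assumed value.
lr = 0.1
W2, b2 = W2 - lr * grad_W2, b2 - lr * grad_b2
W1, b1 = W1 - lr * grad_W1, b1 - lr * grad_b1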

11
Q

What are the key ideas of convolutional neural networks?

A

Key ideas of CNNs:
1. Convolutional filters for feature detection
2. Parameter sharing
3. Local connectivity
4. Pooling layers for dimensionality reduction.
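
A toy NumPy sketch of ideas 1-4: one 3x3 filter reused across a small image (parameter sharing), each output depending only on a local patch (local connectivity), then 2x2 max pooling. The image and filter values are arbitrary.

import numpy as np

# One 3x3 filter slid over a 6x6 "image": the same nine weights are reused at
# every position, and each output value depends only on a local 3x3 patch.
image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)   # simple vertical-edge filter

def conv2d(img, k):
    kh, kw = k.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def max_pool_2x2(fm):
    # 2x2 max pooling halves each spatial dimension.
    return fm.reshape(fm.shape[0] // 2, 2, fm.shape[1] // 2, 2).max(axis=(1, 3))

features = conv2d(image, kernel)   # 4x4 feature map
print(max_pool_2x2(features))      # 2x2 map after pooling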

12
Q

What are the key ideas of recurrent neural networks?

A

Key ideas of RNNs:
1. Processing sequences of variable length
2. Memory of previous inputs
3. Hidden state maintenance
4. Variants like LSTM and GRU for long-term dependencies
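
A minimal Elman-style RNN step in NumPy showing ideas 1-3: the same weights process each element of a variable-length sequence while a hidden state carries memory forward. Sizes and values are illustrative, and the LSTM/GRU gating of point 4 is omitted.

import numpy as np

# The same weights are applied at every time step, and the hidden state h
# carries a memory of earlier inputs forward through the sequence.
rng = np.random.default_rng(0)
W_xh = rng.standard_normal((4, 3)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((4, 4)) * 0.1   # hidden -> hidden (the recurrence)
b_h = np.zeros(4)

sequence = [rng.standard_normal(3) for _ in range(5)]   # 5 time steps, 3 features each
h = np.zeros(4)                                         # initial hidden state
for x_t in sequence:
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)            # hidden state update
print(h)   # the final hidden state summarizes the whole sequence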
