Understanding CNNs Flashcards

1
Q

What is a CNN?

A

A type of deep neural network designed for visual data, inspired by the human visual system.

2
Q

Key Applications of CNNs

A

Used in facial recognition, object detection, medical imaging, self-driving cars, etc.

3
Q

What is Convolution?

A

A mathematical operation applying a small matrix (kernel) to an image to extract features.

4
Q

What do CNN filters detect?

A

Low-level features such as edges and textures in early layers; object parts and whole objects in deeper layers.

5
Q

How does convolution work?

A

A kernel slides over an image, performing element-wise multiplication and summation.

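A minimal NumPy sketch of this sliding-window operation (assuming stride 1 and no padding; strictly this is cross-correlation, which is what deep-learning libraries implement as "convolution"):

import numpy as np

def convolve2d(image, kernel):
    # Slide the kernel over the image; at each position, multiply
    # element-wise and sum into one output value.
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, 0.0, -1.0]] * 3)  # simple vertical-edge detector
print(convolve2d(image, kernel))           # 2x2 feature map: (4 - 3) + 1 = 2
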
6
Q

What is stride in CNNs?

A

The step size by which the kernel moves across the input image.

7
Q

What is padding in CNNs?

A

padding='valid': No padding; the output shrinks.

padding='same': Zero-pads the input so the output keeps the input's spatial size (see the sketch below).

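A short sketch of how stride and padding are typically specified (assuming TensorFlow/Keras; filter counts and shapes are illustrative):

from tensorflow.keras import layers

# padding='valid': no padding, a 32x32 input shrinks to (32 - 3)/1 + 1 = 30
conv_valid = layers.Conv2D(16, kernel_size=3, strides=1, padding='valid')

# padding='same': zero-pads so a 32x32 input stays 32x32 (at stride 1)
conv_same = layers.Conv2D(16, kernel_size=3, strides=1, padding='same')

# strides=2: the kernel moves two pixels at a time, roughly halving each dimension
conv_strided = layers.Conv2D(16, kernel_size=3, strides=2, padding='same')
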
8
Q

Layers in a CNN

A

Convolutional Layer: Extracts feature maps from images.

Pooling Layer: Reduces spatial size (downsampling).

Activation Function: Introduces non-linearity (e.g., ReLU).

Fully Connected Layer: Performs the final classification (see the sketch below).

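A minimal sketch stacking the four layer types (assuming TensorFlow/Keras, a 32x32 grayscale input, and 10 classes; all sizes are illustrative):

from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),
    layers.Conv2D(16, 3, activation='relu'),  # convolution + ReLU non-linearity
    layers.MaxPooling2D(2),                   # pooling: downsamples 30x30 -> 15x15
    layers.Flatten(),
    layers.Dense(10, activation='softmax'),   # fully connected classification
])
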
9
Q

Pooling in CNNs

A

Purpose: Reduces dimensionality while preserving important features.

Types: Max pooling (common) and average pooling.

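A tiny worked example contrasting the two types on a single 2x2 window (NumPy):

import numpy as np

window = np.array([[1.0, 3.0],
                   [2.0, 8.0]])

print(window.max())   # max pooling     -> 8.0 (keeps the strongest activation)
print(window.mean())  # average pooling -> 3.5
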
10
Q

Fully Connected Layer

A

Flattens the feature map into a vector and maps it to class scores.

Uses an activation function such as softmax to produce class probabilities.

11
Q

How do CNNs learn?

A

They process images with filters to extract patterns.

Use a loss function to measure prediction error.

Use backpropagation to adjust filter weights.

Use optimization algorithms such as SGD or Adam to apply the weight updates.

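A sketch of these steps in code (assuming TensorFlow/Keras; the model and the random placeholder data are illustrative):

import numpy as np
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),
    layers.Conv2D(16, 3, activation='relu'),
    layers.Flatten(),
    layers.Dense(10, activation='softmax'),
])

# The loss measures prediction error; backpropagation computes the
# gradients; Adam (or SGD) applies the weight updates.
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

x_train = np.random.rand(100, 32, 32, 1)      # placeholder images
y_train = np.random.randint(0, 10, size=100)  # placeholder labels
model.fit(x_train, y_train, epochs=1, batch_size=32)
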
12
Q

What is Backpropagation?

A

A method to compute the gradient of the loss function and adjust weights accordingly.

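A minimal sketch of the idea with a single weight and a squared-error loss (plain Python; the gradient is computed analytically via the chain rule):

# Model: pred = w * x, loss = (pred - target)^2
w, x, target, lr = 0.5, 2.0, 3.0, 0.1

for step in range(3):
    pred = w * x
    grad = 2 * (pred - target) * x  # dLoss/dw
    w -= lr * grad                  # move the weight against the gradient
    print(step, round(w, 3))        # w climbs toward the optimum of 1.5
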
13
Q

Example: A Simple CNN

Input: 32×32 image
Conv Layer 1: 3×3 kernel, 16 filters → (30×30×16)
Conv Layer 2: 3×3 kernel, 32 filters → (28×28×32)
Fully Connected Layer: 25,088 input units → 10 output classes.

Calculate the total number of parameters.

A

Conv Layer 1: 3×3×1×16 = 144 learnable parameters.
Conv Layer 2: 3×3×16×32 = 4,608 learnable parameters.
Fully Connected Layer: 25,088×10 = 250,880 learnable parameters.
Total = 144 + 4,608 + 250,880 = 255,632 parameters (biases excluded; see the check below).

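The count can be verified in code (a sketch assuming TensorFlow/Keras and a single-channel input, which is what the 144-parameter figure implies; use_bias=False matches the bias-free totals above):

from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),
    layers.Conv2D(16, 3, use_bias=False),  # 3*3*1*16  = 144
    layers.Conv2D(32, 3, use_bias=False),  # 3*3*16*32 = 4,608
    layers.Flatten(),                      # 28*28*32  = 25,088 units
    layers.Dense(10, use_bias=False),      # 25,088*10 = 250,880
])
print(model.count_params())                # 255,632
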
14
Q

Who developed LeNet?

A

Yann LeCun.

15
Q

What was LeNet used for?

A

Recognizing handwritten MNIST digits.

16
Q

Key Features of LeNet

A

Alternates convolutional and pooling (subsampling) layers.
Uses local receptive fields with shared weights.
Ends with fully connected layers and softmax.

17
Q

Why is Batch Normalization needed?

A

Prevents vanishing/exploding gradients.
Stabilizes training.
Reduces sensitivity to weight initialization.

18
Q

How does Batch Normalization work?

A

Normalizes each layer's activations using the batch mean and variance, then applies a learnable scale and shift.

Acts as a regularizer, reducing overfitting.
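
A minimal NumPy sketch of the training-time computation (gamma and beta stand in for the learnable scale and shift):

import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

acts = np.random.randn(64, 128) * 5.0 + 3.0  # batch of 64 with skewed statistics
out = batch_norm(acts)
print(out.mean(), out.std())                 # approximately 0 and 1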

19
Q

How to improve CNN generalization?

A

Spatial Dropout: Randomly drops entire feature maps, preventing over-reliance on specific features.

Hyperparameter optimization: Tuning learning rates, batch sizes, etc.
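
A sketch of where spatial dropout sits in a model (assuming TensorFlow/Keras; the 0.2 rate and layer sizes are illustrative):

from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),
    layers.Conv2D(16, 3, activation='relu'),
    layers.SpatialDropout2D(0.2),  # drops whole feature maps, not single pixels
    layers.Flatten(),
    layers.Dense(10, activation='softmax'),
])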

20
Q

Key similarities and differences between backpropagation and regularization

A

Backpropagation aims to minimize loss by updating weights based on error gradients.

Regularization modifies the weight update process to improve generalization and prevent overfitting.

Both affect weight updates.

Both are iterative processes.

Regularization can be incorporated into backpropagation (e.g., L2 regularization adds a penalty term to the weight updates).
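
A one-line sketch of that combined update rule (plain Python; lr and lam are illustrative hyperparameters):

lr, lam = 0.01, 1e-4  # learning rate, L2 strength

def sgd_step_with_l2(w, grad):
    # Backpropagation supplies grad; L2 adds lam * w, shrinking
    # weights toward zero on every update.
    return w - lr * (grad + lam * w)

print(sgd_step_with_l2(w=0.8, grad=0.05))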

21
Q

Feature map size formula

A

(input size − kernel size + 2 × padding) / stride + 1
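
Example (Conv Layer 1 from card 13): (32 − 3 + 2×0) / 1 + 1 = 30, giving the 30×30 feature map.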

22
Q

Pooling size formula

A

input size / pool size (assuming the stride equals the pool size and no padding)
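
Example: a 2×2 pool with stride 2 on a 28×28 map gives 28 / 2 = 14, i.e., a 14×14 output.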