CNN Flashcards

1
Q

What is the problem with sigmoid activation function as we add more hidden layers? What is the solution to this?

A

Vanishing gradient: the sigmoid's derivative is at most 0.25, so gradients shrink geometrically as they propagate back through the layers, and the weight updates in early layers become infinitesimally small.

ReLU (Rectified Linear Unit)
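A minimal sketch (plain Python, no ML library) of why the gradient vanishes: backpropagation multiplies in one sigmoid derivative per layer, and that derivative never exceeds 0.25.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

# Backprop multiplies one such derivative per layer; even in the
# best case (0.25 each time) the gradient shrinks geometrically.
grad = 1.0
for layer in range(10):
    grad *= sigmoid_deriv(0.0)  # 0.25 per layer

print(grad)  # 0.25**10 ≈ 9.5e-07 — early layers barely update
```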

2
Q

What are sigmoids mostly used for in CNNs?

A

As output transformations when we want a probability, since the sigmoid squashes any real value into (0, 1)
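A small sketch of the sigmoid as an output transformation (the logit value here is hypothetical):

```python
import math

def sigmoid(x: float) -> float:
    # Squashes any real-valued logit into (0, 1), usable as a probability.
    return 1.0 / (1.0 + math.exp(-x))

logit = 2.0  # hypothetical raw score from the network's final layer
p = sigmoid(logit)
print(f"P(class = 1) = {p:.3f}")  # ≈ 0.881
```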

3
Q

What is the ReLU function mathematically?

A

f(x) = max(0,x)
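The formula translates directly into code:

```python
def relu(x: float) -> float:
    # f(x) = max(0, x): passes positives through, zeroes out negatives.
    return max(0.0, x)

print(relu(-3.2))  # 0.0
print(relu(5.0))   # 5.0
```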

4
Q

What is a deep NN?

A

A network with two or more hidden layers

5
Q

Why do we need DNNs?

A

To model complex real-world problems that shallow networks cannot represent efficiently

6
Q

What does the term “compositional structure of natural data” mean? How do DNNs use this?

A

Real-world data is inherently hierarchical: simple features (e.g. edges) compose into more complex ones (shapes, then objects)

DNNs exploit this to classify/generate content by stacking layers of feature extractors and filters, each layer building on the features detected by the previous one

7
Q

What is a CNN?

A

A network with one or more convolutional layers

Convolutional layers consist of:
Trainable filters
Non-linear activation function
Pooling layer

8
Q

What are filters?

A

Trainable feature extractors

An N-dimensional window (kernel) that slides over the input; at each position it computes a dot product between its weights and the values under the window, producing one entry of the feature map
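The sliding dot product can be sketched in plain Python (the 3×3 image and 2×2 kernel values below are hypothetical; real CNNs use optimized library routines):

```python
def convolve2d(image, kernel):
    """'Valid' cross-correlation: slide the kernel window over the
    image and take a dot product at each position."""
    kh, kw = len(kernel), len(kernel[0])
    h = len(image) - kh + 1
    w = len(image[0]) - kw + 1
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
k = [[1,  0],
     [0, -1]]
print(convolve2d(img, k))  # [[-4, -4], [-4, -4]]
```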

9
Q

What is pooling?

A

Reducing the size of a feature map by downsampling

Like filtering, it uses a window to scan the feature map, but each patch is reduced to a single value

10
Q

Types of pooling?

A

Average = calculate the average of each patch
Max = select the maximum from each patch
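Both types differ only in how a patch is reduced, which a sketch makes concrete (2×2 non-overlapping windows; the feature-map values are hypothetical):

```python
def pool2d(fmap, size, op):
    """Slide a non-overlapping size x size window over the feature map
    and reduce each patch to one value with op (max or average)."""
    h, w = len(fmap) // size, len(fmap[0]) // size
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            patch = [fmap[i * size + di][j * size + dj]
                     for di in range(size) for dj in range(size)]
            row.append(op(patch))
        out.append(row)
    return out

avg = lambda patch: sum(patch) / len(patch)

fmap = [[1, 3, 2, 4],
        [5, 7, 6, 8],
        [9, 2, 1, 0],
        [3, 4, 5, 6]]
print(pool2d(fmap, 2, max))  # [[7, 8], [9, 6]]
print(pool2d(fmap, 2, avg))  # [[4.0, 5.0], [4.5, 3.0]]
```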
11
Q

Why pool?

A

Shrinks the feature map, making it cheaper for later layers to process and adding some tolerance to small shifts in the input

12
Q

CNN process

A
  1. Normalise input
  2. Filter input
  3. Apply activation function to filtered input
  4. Pool to reduce the feature map size
  5. Repeat for every convolutional layer
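The steps above can be sketched end-to-end for one convolutional layer (plain Python; the image and kernel values are hypothetical):

```python
def normalise(img):
    # Step 1: scale pixel values into [0, 1].
    mx = max(max(row) for row in img)
    return [[v / mx for v in row] for row in img]

def filter2d(img, kernel):
    # Step 2: slide the kernel and take a dot product at each position.
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(img[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(len(img[0]) - kw + 1)]
            for i in range(len(img) - kh + 1)]

def relu(fmap):
    # Step 3: non-linear activation on the filtered map.
    return [[max(0.0, v) for v in row] for row in fmap]

def max_pool(fmap, size=2):
    # Step 4: shrink the feature map by keeping each patch's maximum.
    return [[max(fmap[i * size + di][j * size + dj]
                 for di in range(size) for dj in range(size))
             for j in range(len(fmap[0]) // size)]
            for i in range(len(fmap) // size)]

image = [[0, 255, 0, 255, 0],
         [255, 0, 255, 0, 255],
         [0, 255, 0, 255, 0],
         [255, 0, 255, 0, 255],
         [0, 255, 0, 255, 0]]
kernel = [[1, -1],
          [-1, 1]]

# Step 5 would repeat this chain once per convolutional layer.
out = max_pool(relu(filter2d(normalise(image), kernel)))
print(out)  # [[2.0, 2.0], [2.0, 2.0]]
```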