ANN, CNN Flashcards

1
Q

role of softmax in logistic regression

A
  • translates the linear prediction values into category probabilities

- Suppose Zi = Wi*x + Bi is the linear prediction for category i. Softmax makes each Zi non-negative by exponentiating it, then normalises by the sum over all categories, so each Oi = σi(Z) = exp(Zi) / Σj exp(Zj) can be interpreted as the probability, or likelihood, that the data x belongs to category i.
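A minimal NumPy sketch of this computation; the example scores and the shift by max(z) (a common numerical-stability trick) are my own additions for illustration:

import numpy as np

def softmax(z):
    # exponentiate so every score becomes non-negative
    # (subtracting max(z) only improves numerical stability)
    e = np.exp(z - np.max(z))
    # normalise so the outputs sum to 1 and read as category probabilities
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])   # example linear scores Zi = Wi*x + Bi
print(softmax(z))               # approx. [0.659, 0.242, 0.099], sums to 1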

2
Q

Consider a dataset of 200 samples. The dataset passes through the model for 1,000 epochs, i.e. 1,000 complete passes.

What is the batch size?
How many batches are there?
How many times is the model updated?

A
  • batch size of 5

This means that the model weights are updated as each of the 40 batches of five samples passes through the network. Hence the model is updated 40 times per epoch, i.e. 40,000 times over the 1,000 epochs.
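A quick check of the arithmetic in Python, using the numbers from the card:

samples, batch_size, epochs = 200, 5, 1000

batches_per_epoch = samples // batch_size     # 200 / 5  = 40 batches
updates_per_epoch = batches_per_epoch         # one weight update per batch -> 40
total_updates = updates_per_epoch * epochs    # 40 * 1000 = 40,000 over the whole run
print(batches_per_epoch, updates_per_epoch, total_updates)   # 40 40 40000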

3
Q

what is ANN

A

artificial neural network
- interconnected group of nodes

4
Q

operation of single neuron

A
  • each neuron performs a simple operation on its input:
    1. compute a weighted sum of its inputs, z = Σ wi*xi + b
    2. apply a non-linear activation function to z ( y = f(z) )
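A minimal NumPy sketch of a single neuron; the specific weights, bias, and the choice of ReLU as f are illustrative assumptions:

import numpy as np

def neuron(x, w, b):
    z = np.dot(w, x) + b       # 1. weighted sum of the inputs
    return np.maximum(0.0, z)  # 2. non-linear activation, here ReLU: y = f(z) = max(0, z)

x = np.array([1.0, 2.0, 3.0])   # example input
w = np.array([0.4, -0.2, 0.1])  # example weights
b = 0.5                         # example bias
print(neuron(x, w, b))          # about 0.8 (= ReLU(0.4 - 0.4 + 0.3 + 0.5))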
5
Q

what is CNN

A
  • a special type of ANN for images
  • heavily used for vision applications
  • extract features automatically from images
  • no need for hand-engineered feature design
  • high level of generalisation for different tasks (transfer learning)
6
Q

what are convolution layers

A
  • a CNN is a sequence of convolution layers, each followed by an activation function
  • output of one layer becomes input to the next layer
  • allowing the network to learn more complex representations of the input.
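A minimal PyTorch sketch of this structure; the channel counts and image size are arbitrary choices for illustration:

import torch
import torch.nn as nn

# each convolution is followed by an activation; the output of one layer
# becomes the input to the next
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
)

x = torch.randn(1, 3, 32, 32)   # a dummy 32x32 RGB image
print(cnn(x).shape)             # torch.Size([1, 64, 32, 32])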
7
Q

what do conv layers require

A
  • number of filters K
  • filter size F
  • stride S
  • padding P
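A sketch of these hyperparameters in a PyTorch conv layer; the specific values of K, F, S, P and the output-size formula (W - F + 2P)/S + 1 are standard conventions, not taken from the card:

import torch
import torch.nn as nn

K, F, S, P = 8, 3, 1, 1   # number of filters, filter size, stride, padding
conv = nn.Conv2d(in_channels=3, out_channels=K, kernel_size=F, stride=S, padding=P)

x = torch.randn(1, 3, 32, 32)
out = conv(x)
# output spatial size = (W - F + 2P)/S + 1 = (32 - 3 + 2)/1 + 1 = 32
print(out.shape)          # torch.Size([1, 8, 32, 32])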
8
Q

what is a max pooling layer

A
  • used to make the representation smaller and more tractable
  • operates over each activation map independently
  • takes the max over the filter’s view
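A minimal PyTorch sketch of max pooling; the 2x2 window with stride 2 is one common choice, not the only one:

import torch
import torch.nn as nn

pool = nn.MaxPool2d(kernel_size=2, stride=2)   # take the max over each 2x2 window

x = torch.tensor([[[[1., 2., 5., 6.],
                    [3., 4., 7., 8.],
                    [9., 1., 2., 3.],
                    [4., 5., 6., 7.]]]])       # one 4x4 activation map
print(pool(x))   # tensor([[[[4., 8.], [9., 7.]]]]) - each map is pooled independently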
9
Q

advantages of max pooling layer

A
  • robustness to noise
  • encodes the idea “did I find any match for the filter in the search area?”