Lecture 19 - Learning and Structure in Neural Networks Flashcards

1
Q

What are the two functions of neurons?

A
  • An input function
  • An activation function
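A minimal sketch (in Python, not from the lecture) of how these two functions might look for a single artificial neuron; the function names and numbers are just illustrative:

    import math

    def input_fn(inputs, weights):
        # Input function: combine the incoming values into one number
        # (here, a simple weighted sum).
        return sum(x * w for x, w in zip(inputs, weights))

    def activation_fn(total):
        # Activation function: turn the combined input into the neuron's
        # output (here, a sigmoid that squashes it between 0 and 1).
        return 1 / (1 + math.exp(-total))

    # One neuron with three inputs.
    print(activation_fn(input_fn([0.5, -1.0, 2.0], [0.4, 0.1, 0.3])))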

2
Q

What is the architecture of a neural network?

A

How many neurons there are, and how they connect

3
Q

List the four components of a neural network…

A
  • Neurons
  • Architecture
  • Weights
  • Training
4
Q

What components of a neural network are relatively constant in their design?

A
  • Weights
  • Neurons

5
Q

What are the two types of architecture, and what is the difference between them?

A
  • Classifier: lots of inputs, one output
  • Transformation: lots of inputs, lots of outputs
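A minimal sketch of the difference, assuming toy networks with made-up layer sizes; the weights are random, so only the shapes of what goes in and what comes out are meaningful:

    import numpy as np

    rng = np.random.default_rng(0)

    def run(layer_sizes, x):
        # Push an input vector through randomly weighted layers.
        for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
            x = np.tanh(x @ rng.normal(size=(n_in, n_out)))
        return x

    x = rng.normal(size=100)
    print(run([100, 10, 1], x).shape)    # classifier: (1,), one outcome
    print(run([100, 10, 100], x).shape)  # transformation: (100,), many outputs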

6
Q

What is information compression?

A

Squeeze data into a smaller size

7
Q

What are some uses of compression in terms of technology?

A

Built around unchanging pixels:
  • In an image, pixels right next to each other are often about the same
  • In a movie, the next frame is usually about the same as the last frame (especially in some areas)

Built around how common patterns are:
  • If something is very common, give it a very brief shorthand
  • If something is uncommon, give it a longer shorthand
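A minimal sketch of the first idea (neighbouring pixels are often the same), using simple run-length encoding in Python; this is an illustration, not the specific scheme from the lecture. The second idea, giving the most common patterns the shortest codes, is what schemes such as Huffman coding do.

    def run_length_encode(pixels):
        # Replace runs of identical values with [value, count] pairs, so
        # long stretches of unchanging pixels take up very little space.
        encoded = []
        for p in pixels:
            if encoded and encoded[-1][0] == p:
                encoded[-1][1] += 1
            else:
                encoded.append([p, 1])
        return encoded

    row = [255, 255, 255, 255, 0, 0, 255, 255, 255]
    print(run_length_encode(row))  # [[255, 4], [0, 2], [255, 3]]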

8
Q

How does compression work in terms of language?

A

Find patterns; then you only need to code the unique part and add back the common part.

E.g. if everything starts with "the + noun", just transmit the noun part, which will be different each time.
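A minimal sketch of that idea in Python, assuming every phrase really does follow the "the + noun" pattern; the example phrases are made up:

    def compress(phrases, pattern="the "):
        # Only the unique part needs to be stored or transmitted.
        return [p[len(pattern):] for p in phrases]

    def decompress(parts, pattern="the "):
        # Add the common part back when reconstructing.
        return [pattern + p for p in parts]

    phrases = ["the dog", "the cat", "the house"]
    print(compress(phrases))              # ['dog', 'cat', 'house']
    print(decompress(compress(phrases)))  # ['the dog', 'the cat', 'the house']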

9
Q

How does compression work in terms of children learning grammar?

A

Children learn to pick up the pattern in what they are hearing, e.g. in the "the + noun" example only a small number of words (the class of determiners) can go in the "the" spot

10
Q

What are classifying architectures generally used for?

A

Word recognition

11
Q

Put in order…
learning
compression
pattern finding

A
  • Compression
  • Pattern finding
  • Learning
12
Q

How can you begin learning once you recognize patterns?

A

Pattern finding is essentially learning: once you can find the pattern, you are no longer memorizing, and you can take things to novel situations.

13
Q

What are two ways learning can go wrong?

A
  • Overfitting: the network just memorizes every answer, so no pattern is ever learned
  • Underfitting: the network loses too much information and is not able to answer

14
Q

How is capacity involved in choosing an architecture?

How can this problem be fixed?

A
  • If the capacity is far bigger than the problem, no learning can occur, just memorizing
  • Using only a few hidden nodes forces the network to learn the pattern rather than memorize
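A minimal sketch of this design choice, with made-up layer sizes: the hidden layer acts as a bottleneck, and its size relative to the problem decides whether memorizing is even possible.

    n_inputs, n_outputs = 100, 100
    n_examples = 50

    # Hidden layer far bigger than the problem: enough capacity to store
    # each of the 50 examples separately, so the network can get away
    # with memorizing instead of finding a pattern.
    oversized_hidden = 1000

    # Only a few hidden nodes: the 100 inputs must be squeezed through a
    # narrow bottleneck, which forces the network to find the pattern.
    bottleneck_hidden = 5

    print([n_inputs, oversized_hidden, n_outputs])   # [100, 1000, 100]
    print([n_inputs, bottleneck_hidden, n_outputs])  # [100, 5, 100]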
15
Q

What are the two key points from this lecture?

A
  • The network structure imitates the problem to solve
  • Nothing in the system indicates what it is about; this is decided from outside
