Lecture 18 - Build a neural network for language (Flashcards)

1
Q

What do we need to build a neural network?

A
  • Something that can activate (fire) when a threshold is reached
  • Inputs to and outputs from that something
  • These inputs and outputs are the connections
  • The connections can be excitatory or inhibitory
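The ingredients above can be sketched as a single threshold unit. This is a minimal illustration, not the lecture's model; the function name, weights, and threshold value are all invented for the example.

```python
# A minimal sketch of a single unit that fires with a threshold,
# with weighted (excitatory/inhibitory) input connections.

def threshold_unit(inputs, weights, threshold=0.5):
    """Fire (output 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory connections (+0.4, +0.3) and one inhibitory one (-0.5):
print(threshold_unit([1, 1, 0], [0.4, 0.3, -0.5]))  # fires: 0.7 >= 0.5
print(threshold_unit([1, 1, 1], [0.4, 0.3, -0.5]))  # silent: 0.2 < 0.5
```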
2
Q

How do you build into a model the concept of neural firing being excitatory or inhibitory?

A
  • Weighted connections
  • Weights range from -1 (inhibitory) to +1 (excitatory)

3
Q

What activation model is more commonly used in neural networks than all-or-none threshold activation?

A

  • A sigmoid (gradual) activation curve
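The standard sigmoid (logistic) function can be written in a few lines; this is a generic sketch rather than any specific model from the lecture.

```python
import math

def sigmoid(x):
    """Gradual activation: output rises smoothly from 0 to 1
    instead of jumping at a threshold."""
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(-4))  # close to 0 (nearly off)
print(sigmoid(0))   # 0.5 (the midpoint)
print(sigmoid(4))   # close to 1 (nearly on)
```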

4
Q

How do you work out how much input something is getting?

A
  • Multiply each incoming neuron's activation by its connection weight
  • Add all of these products up
  • This total is the net input; the neuron produces an output if it has reached threshold
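The computation above (multiply each sender's activation by its weight, then sum) is a few lines of code. The activations and weights below are made up for illustration; passing the total through a sigmoid, as in the previous card, gives a graded output.

```python
import math

def net_input(activations, weights):
    """Sum each sender's activation times its connection weight."""
    return sum(a * w for a, w in zip(activations, weights))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

senders = [1.0, 0.5, 0.8]    # how strongly each sender neuron is firing
weights = [0.9, -0.6, 0.3]   # excitatory (+) and inhibitory (-) connections
total = net_input(senders, weights)
print(total)           # roughly 0.84
print(sigmoid(total))  # graded output between 0 and 1
```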
5
Q

What is the simplest way we might model language with a neuron network? What is this method called?

A
  • One input/neuron = one word
  • Called a localist representation

6
Q

What is a more realistic way we might model language with a neuron network? What is this method called?

A
  • One input/neuron = one letter
  • So presenting a word means spelling out the letters in the word
  • Called a distributed representation
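One simple way to sketch this letter-based coding is a 26-unit input vector, one unit per letter of the alphabet; this encoding scheme is an illustrative assumption, not the lecture's exact scheme. Note that it cannot tell one occurrence of a letter from two, which is exactly the problem with verbs like "kiss" raised in a later card.

```python
import string

def encode_word(word):
    """Return a 26-element activation vector:
    unit i is 1 if letter i appears anywhere in the word."""
    return [1 if letter in word.lower() else 0
            for letter in string.ascii_lowercase]

cat = encode_word("cat")
print(sum(cat))                   # 3 units active: a, c, t
print(sum(encode_word("kiss")))   # also 3: the double s collapses to one unit
```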
7
Q

What are the steps to ‘teach’ a neural network to get words from letters?

A

1: Activate the input
2: Get some (initially random) output
3: Compute the error (the difference from the correct answer)
4: Adjust weights so connections to the correct answer get more weight and connections to incorrect ones get less
5: Repeat training over and over (thousands of times) so the network learns lots of words
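The steps above can be sketched as a tiny error-driven training loop, here using a perceptron-style weight-update rule on an invented two-input task; all names and numbers are illustrative, not from the lecture.

```python
import random

random.seed(0)

def predict(inputs, weights, bias):
    """Step 1: activate the input and compute the unit's output."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) + bias >= 0 else 0

def train(examples, n_inputs, rate=0.1, epochs=100):
    # Step 2: start from random weights, so early outputs are random.
    weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
    bias = random.uniform(-1, 1)
    for _ in range(epochs):                 # Step 5: repeat many times
        for inputs, target in examples:
            error = target - predict(inputs, weights, bias)  # Step 3
            # Step 4: strengthen connections toward the correct answer,
            # weaken connections toward the wrong one.
            weights = [w + rate * error * i for w, i in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# Toy task: output 1 exactly when the first input unit is active.
data = [([1, 0], 1), ([0, 1], 0), ([1, 1], 1), ([0, 0], 0)]
w, b = train(data, 2)
for inputs, target in data:
    # After training, every example should come out correct.
    print(inputs, predict(inputs, w, b) == target)
```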

8
Q

Who were the researchers who did lots of research in this area, and what did they teach their network to do?

A
  • Rumelhart & McClelland
  • The input was a verb and the goal was to teach the network to produce its past tense
  • They followed the exact same training process as before
9
Q

In Rumelhart and McClelland's neural network, what problem presents with verbs like "kiss"?

A

The network's representation is too simple to encode the same letter appearing twice (the double S in "kiss")

10
Q

After a while (many repetitions) what does the network learn to do instead of just memorizing?

A
  • It learns the pattern
  • For the verbs this means it automatically adds -ed, as it knows this marks the past tense in English
  • You can test that the network has learnt the pattern by entering a new word it has not been trained on; it should be able to produce the past tense without being taught explicitly
11
Q

What problem presents when the pattern for making verbs into the past tense has been learnt?

A
  • There are irregular verbs that don't follow the rule
  • The network gets these wrong because it over-generalizes
  • Rumelhart and McClelland found it took the network longer to learn these verbs accurately (it has to memorize them one by one)