Lecture 11 - Neural Networks Flashcards
What are some applications of language modelling and natural language processing?
- Search
- Question answering
- Translation
- Sentence generation
- Speech recognition
- Topics/Summaries
What is the XOR problem?
A simple unit (perceptron) has a binary output and can only separate linearly separable classes, so it cannot compute XOR, which requires modelling complex interactions between inputs.
What is one solution to the XOR problem?
Neural networks, thanks to their hidden layer(s) and non-linear activation functions
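As a sketch of this answer, a tiny two-layer network with hand-picked (not learned) weights can compute XOR, which no single perceptron can; the specific weights and thresholds below are illustrative choices, not from the lecture:

```python
def step(z):
    """Binary threshold activation, as in the original perceptron."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    """Two-layer network computing XOR with hand-picked weights.

    h1 fires when either input is on (OR), h2 fires when both are
    on (AND); the output fires for h1 AND NOT h2, i.e. XOR.
    """
    h1 = step(x1 + x2 - 0.5)    # hidden unit acting as OR
    h2 = step(x1 + x2 - 1.5)    # hidden unit acting as AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

The hidden layer is what makes this possible: each hidden unit carves out one linearly separable region, and the output unit combines them.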
What is meant by activation function?
A non-linear function (e.g. sigmoid or ReLU) that each neuron applies to the weighted sum of its inputs to produce its output; the weights and input values determine where on the function's curve the output lands.
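A minimal sketch of a single neuron with a sigmoid activation; the input values, weights, and bias below are made-up numbers for illustration:

```python
import math

def sigmoid(z):
    """Sigmoid activation: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    """A single neuron: weighted sum of inputs, then the activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Illustrative inputs and weights (not from the lecture):
y = neuron([1.0, 0.5], [0.3, -0.2], 0.1)
```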
What is meant by a feed-forward network?
A network in which information only flows forward, from input to output; there are no cycles (unlike a recurrent network).
If you have multiple neurons, is the activation function applied to each neuron separately?
Yes
What are the three layers of neural networks?
Input Layer
Hidden Layer (Could have multiple)
Output Layer
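Putting the layers together, a forward pass just applies one fully-connected layer after another; the 2-2-1 shape and all weights below are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    """One fully-connected layer: each row of weights is one neuron,
    and the activation is applied to each neuron separately."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Illustrative 2-2-1 feed-forward network with made-up weights:
x = [1.0, 0.0]                                        # input layer
h = layer(x, [[0.5, -0.4], [0.3, 0.8]], [0.1, -0.1])  # hidden layer
y = layer(h, [[1.2, -0.7]], [0.05])                   # output layer
```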
How do we adjust weights?
Backpropagation
Why would we need to adjust weights?
To reduce the error (i.e. when the computed output != the correct output)
True or False: Backpropagation uses Gradient Descent to fix error
TRUE
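A gradient-descent step can be sketched as moving each weight a small step against its error gradient; the learning rate and numbers here are illustrative:

```python
def sgd_update(weights, grads, lr=0.1):
    """One gradient-descent step: nudge each weight in the direction
    that decreases the error (i.e. against the gradient)."""
    return [w - lr * g for w, g in zip(weights, grads)]

# Made-up weights and gradients for illustration:
new_w = sgd_update([0.5, -0.3], [0.2, -0.1])
```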
True or False: Backpropagation is a heavy process
TRUE
True or False: Error is computed over all output nodes
TRUE
In a hidden layer, we do not have a target output value - So how do we compute the error?
We compute how much each hidden node contributed to the error of the downstream (output) nodes it feeds into
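This answer can be sketched for a sigmoid network: a hidden node's error term is the weighted sum of the error terms of the nodes it feeds, scaled by the slope of its own activation (the numbers in the example are made up):

```python
def sigmoid_prime(a):
    """Derivative of the sigmoid, written in terms of its output a."""
    return a * (1 - a)

def hidden_delta(a_hidden, downstream_deltas, outgoing_weights):
    """Error term for a hidden node: blame flowing back from the
    downstream nodes, scaled by the local activation slope."""
    blame = sum(d * w for d, w in zip(downstream_deltas, outgoing_weights))
    return blame * sigmoid_prime(a_hidden)

# A hidden node with activation 0.6 feeding two output nodes
# (deltas and weights are illustrative):
d = hidden_delta(0.6, downstream_deltas=[0.1, -0.05],
                 outgoing_weights=[0.9, 0.4])
```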
How do neural networks initialize weights?
Randomly, e.g. drawn uniformly from a small interval around zero
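A minimal sketch of uniform random initialization; the interval bounds [-0.05, 0.05] are an illustrative choice, not specified in the lecture:

```python
import random

def init_weights(n_in, n_out, lo=-0.05, hi=0.05):
    """Draw each weight uniformly from a small interval around zero.
    One row per output neuron, one column per input."""
    return [[random.uniform(lo, hi) for _ in range(n_in)]
            for _ in range(n_out)]

w = init_weights(3, 2)  # weights for a layer mapping 3 inputs to 2 neurons
```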
What are the advantages of Neural Networks?
- Neural models don't need smoothing (unlike N-gram models)
- They provide higher accuracy than N-gram models
- They can model complex interactions (solving the XOR problem)