Chapter 2 Flashcards

1
Q

a multilayer neural network consists of

A
  1. Layers of perceptron-like simulated neurons
  2. Units = the simulated neurons
    > Hidden unit = a non-output unit

each input has a weighted connection to each hidden unit, and each hidden unit has a weighted connection to each output unit
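A minimal sketch (not from the original text) of the connectivity described above, assuming NumPy and arbitrarily chosen layer sizes: every input has its own weight to every hidden unit, and every hidden unit has its own weight to every output unit.

```python
import numpy as np

n_inputs, n_hidden, n_outputs = 4, 3, 2   # layer sizes chosen arbitrarily

# one weight per (input, hidden unit) connection
weights_input_to_hidden = np.random.randn(n_inputs, n_hidden)
# one weight per (hidden unit, output unit) connection
weights_hidden_to_output = np.random.randn(n_hidden, n_outputs)

print(weights_input_to_hidden.shape)   # (4, 3): 4 inputs x 3 hidden units
print(weights_hidden_to_output.shape)  # (3, 2): 3 hidden units x 2 output units
```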

2
Q

deep networks

A

a multilayer network with more than one layer of hidden units

The “depth” of a network is simply its number of hidden layers.
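A hypothetical example of counting depth, assuming the layer sizes are listed from input to output (all numbers made up):

```python
# layer sizes from input to output: 784 inputs, three hidden layers, 10 outputs
layer_sizes = [784, 256, 128, 64, 10]

depth = len(layer_sizes) - 2   # exclude the input and output layers
print(depth)                   # 3 hidden layers -> a "deep" network
```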

3
Q

what does a unit do

A

each unit multiplies each of its inputs by the weight on that input’s connection and then sums the results

unlike in a perceptron, a unit here doesn’t simply “fire” or “not fire” based on a threshold; instead, each unit uses its sum to compute a number between 0 and 1 that is called the unit’s “activation.”

these activation values then become the inputs to the output units, which compute their own activations.
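A minimal sketch of one unit's computation, assuming NumPy and using the sigmoid as the squashing function (the card only requires some function that maps the sum to a number between 0 and 1; the sigmoid is one common choice):

```python
import numpy as np

def sigmoid(z):
    # maps any real number to a value strictly between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

def unit_activation(inputs, weights):
    # multiply each input by the weight on its connection, sum the results,
    # then squash the sum into (0, 1) to get the unit's activation
    weighted_sum = np.dot(inputs, weights)
    return sigmoid(weighted_sum)

inputs = np.array([0.5, 0.2, 0.9])       # made-up input values
weights = np.array([0.4, -1.3, 2.0])     # made-up connection weights
print(unit_activation(inputs, weights))  # ~0.85, always strictly between 0 and 1
```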

4
Q

the network’s confidence

A

the activation of an output unit; the higher the activation, the higher the network’s confidence in that unit’s category

5
Q

classification

A

the digit category with the highest confidence can be taken as the network’s answer, i.e., its classification
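A tiny sketch, with made-up confidence values, of taking the highest-confidence digit as the classification:

```python
import numpy as np

# hypothetical output activations ("confidences") for the digit categories 0-9
confidences = np.array([0.02, 0.01, 0.05, 0.03, 0.10,
                        0.04, 0.02, 0.88, 0.07, 0.03])

classification = int(np.argmax(confidences))  # index of the highest confidence
print(classification)  # 7
```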

6
Q

backpropagation

A

a way to take an error observed at the output units and to “propagate” the blame for that error backward so as to assign proper blame to each of the weights in the network

this blame assignment determines how much to change each weight in order to reduce the error
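A minimal numerical sketch of the idea, not the book’s own statement of the algorithm: for a tiny network with one hidden layer, sigmoid units, and squared error, the blame (“delta”) computed at the output units is passed backward through the weights to get a blame value for every weight. All sizes and values here are made up.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.random(4)                  # one training input (4 made-up features)
target = np.array([0.0, 1.0])      # desired outputs for this example
W1 = rng.standard_normal((4, 3))   # input -> hidden weights
W2 = rng.standard_normal((3, 2))   # hidden -> output weights

# forward pass
hidden = sigmoid(x @ W1)
output = sigmoid(hidden @ W2)

# error observed at the output units
error = output - target

# propagate the blame backward, layer by layer
delta_output = error * output * (1 - output)                # blame at each output unit
grad_W2 = np.outer(hidden, delta_output)                    # blame on hidden->output weights
delta_hidden = (W2 @ delta_output) * hidden * (1 - hidden)  # blame at each hidden unit
grad_W1 = np.outer(x, delta_hidden)                         # blame on input->hidden weights
```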

7
Q

learning in neural networks

A

gradually modifying the weights on connections so that each output’s error gets as close to 0 as possible on all training examples.
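A rough sketch of the training loop described above, assuming a backprop(x, target, W1, W2) function like the one in the previous card’s sketch and a list of (input, target) pairs named training_examples; both names are hypothetical.

```python
learning_rate = 0.1                          # step size, chosen arbitrarily

for epoch in range(100):                     # many passes over the training set
    for x, target in training_examples:      # assumed list of (input, target) pairs
        grad_W1, grad_W2 = backprop(x, target, W1, W2)  # blame per weight
        # nudge every weight a small step in the direction that shrinks the error
        W1 -= learning_rate * grad_W1
        W2 -= learning_rate * grad_W2
```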

8
Q

connectionist networks

A

In the 1980s, what we now call neural networks were generally referred to as connectionist networks

the term connectionist refers to the idea that knowledge in these networks resides in weighted connections between units.

9
Q

According to the proponents of connectionism, the key to intelligence was

A
  1. an appropriate computational architecture—inspired by the brain
  2. the ability of the system to learn on its own from data or from acting in the world.
10
Q

good old-fashioned AI, or GOFAI

A

Machine-learning researchers disparagingly referred to symbolic AI methods as good old-fashioned AI, or GOFAI, and roundly rejected them.
