Chapter 15 Flashcards

1
Q

Douglas Lenat’s Cyc project

A

The most famous and longest-lasting attempt to manually encode commonsense knowledge for machines.

Lenat concluded that true progress in AI would require machines to have common sense.

He decided to create a huge collection of facts about the world, along with the logical rules by which programs could use this collection to deduce the facts they needed.

Lenat’s goal was for Cyc to contain all the unwritten knowledge that humans have.

However, humanlike abstraction and analogy making are not skills that can be captured by Cyc’s massive set of facts or, I believe, by logical inference in general.

2
Q

Bongard problems

A

Each problem features twelve boxes: six on the left and six on the right. The six left-hand boxes in each problem exemplify the “same” concept, the six right-hand boxes exemplify a related concept, and the two concepts perfectly distinguish the two sets. The challenge is to find the two concepts.
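The structure described above can be sketched in code. This is a minimal illustration, not anything from the book: the dictionary-based box representation and the `solves` helper are hypothetical. The idea is that a candidate concept is a predicate that must hold for all six left-hand boxes and for none of the six right-hand boxes.

```python
def solves(concept, left_boxes, right_boxes):
    """Return True if `concept` perfectly separates the two sets of boxes."""
    return all(concept(b) for b in left_boxes) and not any(concept(b) for b in right_boxes)

# Hypothetical toy problem: boxes described by a single shape attribute.
left = [{"shape": "triangle"}] * 6    # six left-hand boxes, all triangles
right = [{"shape": "circle"}] * 6     # six right-hand boxes, all circles

is_triangle = lambda box: box["shape"] == "triangle"
print(solves(is_triangle, left, right))  # True: "triangle" vs. "circle" separates the sets
```

Of course, the hard part of a real Bongard problem is not checking a candidate concept but discovering it in the first place, which is exactly the abstraction ability the next card discusses.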

3
Q

challenges of Bongard problems

A
  1. Their solution requires abstraction and analogy-making abilities: abstraction and analogy are all about perceiving “the subtlety of sameness.” To discover this subtle sameness, you need to determine which attributes of the situation are relevant and which you can ignore.
  2. They require the ability to perceive new concepts on the fly.
4
Q

ConvNets and Bongard problems

A

An immediate obstacle is that a set of twelve training examples is laughably inadequate for training a ConvNet; even twelve hundred might not be sufficient.

5
Q

conceptual slippage

A

An idea at the heart of analogy making.

When you attempt to perceive the essential “sameness” of two different situations, some concepts from the first situation need to “slip”—that is, to be replaced by related concepts in the second situation.

6
Q

Copycat

A

A program envisioned by Hofstadter that would solve letter-string analogy problems by using very general algorithms, similar to those he believed humans used when making analogies in any domain.

Copycat solved analogy problems via a continual interaction between the program’s perceptual processes (that is, noticing features in a particular letter-string analogy problem) and its prior concepts (for example, letter, letter group, successor, predecessor, same, and opposite).

7
Q

metacognition

A

The ability to perceive and reflect on one’s own thinking.

Copycat, like all of the other AI programs I’ve discussed in this book, had no mechanisms for self-perception, and this hurt its performance.

The program would sometimes get stuck, trying again and again to solve a problem in the wrong way, and could never perceive that it had previously been down a similar, unsuccessful path.

8
Q

Metacat

A

A successor to Copycat that not only solved analogy problems in Copycat’s letter-string domain but also tried to perceive patterns in its own actions.

When the program ran, it produced a running commentary about what concepts it recognized in its own problem-solving process.

9
Q

Situate

A

Combines the object-recognition abilities of deep neural networks with Copycat’s active-symbol architecture, in order to recognize instances of particular situations by making analogies.

We would like our program to be able to recognize not only straightforward examples, but also unorthodox examples that require conceptual slippages.

10
Q

Copycat, Metacat, and Situate

A

Three examples of several analogy-making programs that are based on Hofstadter’s active-symbol architecture.

11
Q

active-symbol architecture

A

One of many approaches in the AI community to creating programs that can make analogies.

12
Q

Karpathy

A

Karpathy’s example beautifully captures the complexity of human understanding and renders with crystal clarity the magnitude of the challenge for AI.

We need embodiment.

13
Q

embodiment hypothesis

A

A machine cannot attain human-level intelligence without having some kind of body that interacts with the world.
