Lesson ? - Revision for Assessment 1 Flashcards

1
Q

When was the Gestation and Birth Period of AI

A

1943-1956

2
Q

When were the Golden Early Years

A

1956-1969

3
Q

When was the First AI winter?

A

1966-73

4
Q

When was the rise of Knowledge-based and Expert Systems?

A

1969-89

5
Q

When was the start of the rise of New paradigms

continuing until now

A

1986

6
Q

When was the start of the birth of the Scientific Method, Big Data and Deep Learning?

Continuing until the present day

A

1987

7
Q

what was the McCulloch-Pitts paper about?

A

the first description of a neural network

8
Q

Marvin Minsky was a student of whom?

A

Pitts and McCulloch

9
Q

Who built the first neural net machine?

A

Marvin Minsky and Dean Edmonds

10
Q

name of McCulloch-Pitts 1943 paper

A

A Logical Calculus of the Ideas Immanent in Nervous Activity

11
Q

1st work in AI

A

McCulloch and Pitts 1943, A Logical Calculus of the Ideas Immanent in Nervous Activity

12
Q

Who is credited with creating/popularising formal propositional logic

A

Russell and Whitehead

13
Q

neurons

A

electrically excitable cells that process and transmit info through electrical and chemical signals

14
Q

3 parts of the artificial neuron

A

Dendrite, Soma, Axon

15
Q

phi is a ______ function

A

step-like

16
Q

what is fed into a neuron

A

inputs with associated weights

17
Q

what does the neuron do with the inputs and their associated weights

A

computes the weighted sum and passes it through a NON-LINEAR TRANSFER FUNCTION

18
Q

McCulloch and Pitts showed that

A

all logical connectives/any computable function could be computed by a neural network

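The cards above describe how an artificial neuron computes; below is a minimal Python sketch (not from the lecture, with weights and thresholds hand-picked for illustration) of a McCulloch-Pitts style unit that passes a weighted sum of its inputs through a step-like transfer function phi, and with suitable weights computes the AND and OR connectives.

```python
# Minimal sketch of an artificial (McCulloch-Pitts style) neuron:
# weighted sum of inputs passed through a step-like transfer function phi.
# Weights and thresholds below are hand-picked for illustration.

def phi(z, threshold=0.0):
    """Step-like transfer function: fire (1) if the weighted sum reaches the threshold."""
    return 1 if z >= threshold else 0

def neuron(inputs, weights, threshold):
    """Compute the weighted sum of the inputs and pass it through phi."""
    z = sum(x * w for x, w in zip(inputs, weights))
    return phi(z, threshold)

# With suitable weights and thresholds the same unit computes logical connectives.
for a in (0, 1):
    for b in (0, 1):
        and_out = neuron([a, b], weights=[1, 1], threshold=2)  # fires only if both inputs fire
        or_out = neuron([a, b], weights=[1, 1], threshold=1)   # fires if at least one input fires
        print(a, b, "AND:", and_out, "OR:", or_out)
```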
19
Q

Donald Hebb

A

proposed an updating rule for modifying connection weights in artificial neurons allowing a network to be trained

20
Q

Hebbian Learning

A

when neurons fire together, they adapt and form stronger connections with each other through repeated use

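A toy sketch of the Hebbian idea above, assuming the simplest form of the rule (weight change = learning rate x pre-activity x post-activity); the learning rate and the activity pattern are made up for illustration.

```python
# Toy Hebbian update: the connection weight between two units grows
# whenever both are active at the same time (eta is an assumed learning rate).

eta = 0.1          # learning rate (illustrative value)
w = 0.0            # weight of the connection between unit x and unit y

activity = [(1, 1), (1, 1), (0, 1), (1, 0), (1, 1)]  # example co-activations

for x, y in activity:
    w += eta * x * y   # strengthened only when the units fire together
    print(f"x={x} y={y} -> w={w:.1f}")
```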
21
Q

name and date of first neural net computer

A

SNARC, 1951

22
Q

Who built the SNARC

A

Marvin Minsky and Dean Edmonds

23
Q

SNARC was used to

A

model the behaviour of a rat in a maze searching for food

24
Q

neural nets + GPUs =

A

deep learning

25
the 40 neurons in SNARC were simulated by
3000 vacuum tubes
26
Logic Theorist written by
Herbert Simon, Allen Newell, JC Shaw
27
Logic Theorist was called the first...
AI Program
28
Logic Theorist was created when?
1955-6
29
why was Logic Theorist labelled as the 1st AI program
the first program deliberately engineered to mimic the problem-solving skills of a human
30
Simon, Newell and Shaw proved how many of Russell/Whitehead's theorems?
38
31
Name of Russell and Whitehead's paper
Principia Mathematica
32
How many theorems were in Whitehead and Russell's Principia Mathematica
52
33
what did Shaw, Simon and Newell do with the final of Whitehead/Russell's theorems?
created a shorter proof than the original
34
axiom
self-evident truth
35
Logic Theorist: Given some... (APT)
Axioms (A), theorems previously proved (P) and a candidate theorem to prove (T)
36
how many methods were applied by the Logic Theorist
4
37
the methods applied by the Logic Theorist
substitution, detachment, chaining forward, chaining backward
38
what is substitution (Logic Theorist)
change a logic expression (e.g. T) into a logically equivalent one (e.g. an axiom in A) by substitution of variables/replacement of connectives
39
detachment is also known as
modus ponens
40
detachment in Logic Theorist
create sub-goals to prove expressions (e.g. S and S -> T in order to prove T)
41
chaining forward in Logic Theorist
subgoals: A -> C, then A -> B and B -> C
42
chaining backward in Logic Theorist
subgoals: A -> C, then B -> C and A -> B
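A rough sketch (in Python rather than the original IPL) of how detachment and the two chaining methods above turn a goal into subgoals; the intermediate propositions S and B are placeholders chosen purely for illustration.

```python
# Rough sketch of how the Logic Theorist's methods turn a goal into subgoals.
# Implications are represented as ("->", antecedent, consequent); "S" and "B"
# are placeholder propositions used only for illustration.

def detachment(goal_t):
    """To prove T, create the subgoals S and S -> T (modus ponens)."""
    s = "S"
    return [s, ("->", s, goal_t)]

def chain_forward(a, c):
    """To prove A -> C, first prove A -> B, then B -> C."""
    b = "B"
    return [("->", a, b), ("->", b, c)]

def chain_backward(a, c):
    """To prove A -> C, first prove B -> C, then A -> B."""
    b = "B"
    return [("->", b, c), ("->", a, b)]

print(detachment("T"))
print(chain_forward("A", "C"))
print(chain_backward("A", "C"))
```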
43
3 new concepts introduced using logic theorist
reasoning as search, heuristics and list processing
44
reasoning as search
proof viewed as a search starting from a hypothesis root node
45
explain the mechanism of proof as defined by "reasoning as search"
expand the proof along different branches according to deductive rules and stopping when the proposition to be proved is obtained
46
heuristics (as introduced by Logic Theorist)
the proof tree grows exponentially, so some branches need to be pruned using rules of thumb (heuristics) to limit the search space while (hopefully) not losing the path to the solution
47
List processing
the list processing programming language IPL
48
IPL
the list processing language used by the authors to implement the Logic Theorist
49
IPL serves as the basis for which language
LISP
50
who created LISP
John McCarthy
51
at non-root nodes, state = ?
{hypothesis + derived propositions}
52
new states are obtained from old states by...
applying deductive rules
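A minimal sketch of "reasoning as search" as described in the cards above: the state is the hypothesis plus the propositions derived so far, applying deductive rules yields new states, and the search stops when the proposition to be proved is obtained. The propositions and rules below are invented for illustration.

```python
# Minimal "reasoning as search" sketch: state = {hypothesis + derived propositions};
# applying deductive rules (here simple if-then rules) expands the state, and the
# search stops once the target proposition has been derived.
# The propositions and rules below are invented for illustration.

rules = [({"A"}, "B"), ({"B"}, "C"), ({"A", "C"}, "D")]  # (premises, conclusion)

def prove(hypothesis, target):
    state = set(hypothesis)           # root node: just the hypothesis
    while target not in state:
        new = {c for premises, c in rules if premises <= state and c not in state}
        if not new:                   # no rule applies: search space exhausted
            return False
        state |= new                  # expand the state with derived propositions
    return True

print(prove({"A"}, "D"))   # True: A gives B, B gives C, then {A, C} gives D
print(prove({"B"}, "D"))   # False: D needs A, which is never derived
```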
53
objective of Dartmouth Conference (quote)
every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it
54
The Dartmouth Conference aimed to find out how to make a machine do what 4 things? (LAPI)
use language, form abstractions/concepts, solve problems reserved for humans, improve themselves
55
which 4 researchers proposed the Conference
McCarthy, Minsky, Rochester, Shannon (John, Marvin, Nathaniel, Claude)
56
the Conference was attended by who else?
Solomonoff, Selfridge, More, Samuel, Simon and Newell (Ray, Oliver, Trenchard, Arthur, Herbert, Allen)
57
what is considered the birth point of AI
the Dartmouth Conference
58
the GEY was marked by
overoptimism
59
which 3 computer scientists were particularly overoptimistic
Simon, Newell and Minsky
60
who funded labs and which did they fund?
ARPA (Advanced Research Projects Agency) - Stanford, MIT
61
GPS
General Problem Solver
62
who created General Problem Solver and when?
1959 - Simon and Newell
63
the GPS was meant to reason...
in a human-like way to solve any formalised symbolic problem
64
means end analysis - aspect 1
given a current state and goal state, attempt to reduce the difference between the two
65
means end analysis - aspect 2
apply operations to the current state; their outputs create subgoals that reduce the distance to the overall goal as much as possible
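A toy sketch of means-ends analysis as described in the two cards above: pick a difference between the current and goal states, choose an operator that removes it, and treat the operator's preconditions as subgoals. The domain (shopping for milk) and the operators are invented, and this greedy version has no protection against undoing earlier goals.

```python
# Toy means-ends analysis sketch: pick a missing goal fact (a "difference"),
# choose an operator that adds it, recursively achieve the operator's
# preconditions as subgoals, then apply the operator.
# The facts and operators below are invented for illustration.

operators = {
    "buy_milk":   {"pre": {"at_shop"}, "add": {"has_milk"}, "del": set()},
    "go_to_shop": {"pre": set(),       "add": {"at_shop"},  "del": {"at_home"}},
    "go_home":    {"pre": set(),       "add": {"at_home"},  "del": {"at_shop"}},
}

def achieve(state, goal, plan):
    while not goal <= state:                       # some goal facts still missing
        fact = next(iter(goal - state))            # a difference to reduce
        name, op = next((n, o) for n, o in operators.items() if fact in o["add"])
        state = achieve(state, op["pre"], plan)    # subgoal: the preconditions
        state = (state - op["del"]) | op["add"]    # apply the operator
        plan.append(name)
    return state

plan = []
final = achieve({"at_home"}, {"has_milk", "at_home"}, plan)
print(plan)    # ['go_to_shop', 'buy_milk', 'go_home']
print(final)   # the goal facts are now all in the state
```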
66
when did Newell and Simon formulate the physical symbol system hypothesis
1976
67
what led to Newell and Simon formulating the physical symbol system hypothesis
success of GPS and related programs as models of cognition
68
physical symbol system hypothesis
a physical symbol system has the NECESSARY and SUFFICIENT means for general intelligent action
69
physical symbol system
takes symbols, combines them into expressions, and manipulates them to produce new expressions
70
the physical symbol system implies that...
any intelligent system must operate by manipulating symbols
71
True/False: the implications of the physical symbol system are disputed
True
72
who made Geometry Theorem Prover?
Gelernter (1959)
73
Who made Advice Taker
McCarthy (1958)
74
which reasoned with general knowledge more broadly, Advice Taker or General Problem Solver?
Advice Taker
75
how could Advice Taker adapt to a new domain without reprogramming?
by separating explicit representation of world knowledge from deductive reasoning engine
76
who discovered Resolution Theorem Proving?
J. A. Robinson, 1965
77
what was resolution theorem proving?
a complete theorem-proving algorithm for first-order logic
78
resolution theorem proving underlies which programming language?
PROLOG
79
Who/when created the Checkers programs?
Arthur Samuel, from 1952
80
what did Samuel's checkers programs demonstrate?
the potential of computers for non-numerical tasks/AI
81
what core idea underlies the alpha-beta pruning and minimax search strategies introduced by Samuel's checkers programs?
the search space is too big to search exhaustively
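The card above only states the motivation; the sketch below shows minimax with alpha-beta pruning on a hand-built toy game tree (not Samuel's checkers code), where branches that cannot change the final choice are cut off rather than searched exhaustively.

```python
# Minimax with alpha-beta pruning on a toy game tree (nested lists = internal
# nodes, numbers = leaf evaluations). Branches that cannot affect the final
# choice are pruned, so the full tree is never searched exhaustively.

def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if isinstance(node, (int, float)):          # leaf: static evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:                   # prune: MIN will never allow this
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:                   # prune: MAX will never allow this
                break
        return value

tree = [[3, 5], [6, 9], [1, 2]]   # toy 2-ply tree, values invented
print(alphabeta(tree))            # 6 -- the third subtree is pruned after its first leaf (1)
```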
82
what was the first machine learning program
Samuel's checkers program
83
why was Samuel's checkers program considered the first machine learning program
it learned by playing against itself
84
what was the first computer game to reach the level of a respectable amateur?
checkers
85
why were so many early programs restricted to limited domains?
unconstrained natural language is too difficult
86
who created STUDENT and when
Bobrow (1967)
87
who created SHRDLU and when?
Terry Winograd, 1968-72
88
first chatbot & created when
ELIZA, 1964-66
89
who implemented PARRY and when
Kenneth Colby in 1972
90
which is more sophisticated, ELIZA or PARRY?
PARRY
91
blocks-world research occurred primarily in which 3 domains
NLP, computer vision and robotics
92
SHAKEY : developed when and where
SRI (Stanford Research Institute), 1966-72
93
results of the SHAKEY project (3)
A* search algorithm, Hough transform (finding simple shapes in images), visibility graph method (used in robot motion planning)
94
SHAKEY
first general-purpose mobile robot able to reason about its own actions
95
SHAKEY used what to plan?
STRIPS
96
STRIPS requires what 3 things
initial state, goal state and actions
97
STRIPS develops...
a plan to move from the initial state to the goal state by achieving appropriate subgoals
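A sketch of the STRIPS representation described in the two cards above: an initial state, a goal state, and actions given by preconditions, an add list and a delete list. The blocks-world style facts and the single action are invented for illustration.

```python
# Sketch of the STRIPS representation: an initial state, a goal state, and
# actions described by preconditions, an add list and a delete list.
# The blocks-world style facts below are invented for illustration.

initial = {"on(A,table)", "on(B,table)", "clear(A)", "clear(B)"}
goal = {"on(A,B)"}

stack_A_on_B = {
    "pre": {"clear(A)", "clear(B)", "on(A,table)"},
    "add": {"on(A,B)"},
    "del": {"clear(B)", "on(A,table)"},
}

def apply_action(state, action):
    assert action["pre"] <= state, "preconditions not satisfied"
    return (state - action["del"]) | action["add"]

state = apply_action(initial, stack_A_on_B)
print(goal <= state)   # True: this single action achieves the goal
```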
98
in 1976, the world's fastest supercomputer was capable of how many MIPS
1000
99
today, computer vision applications require how many MIPS?
10^4 to 10^6 MIPS
100
combinatorial explosion
\sum_{n=0}^{N} k^n, where k is the number of potential moves and N is the number of steps ahead that need to be examined
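A quick worked computation of the formula above, with illustrative values (k = 10 possible moves, N = 5 steps of lookahead) that are not from the lecture.

```python
# Worked example of the combinatorial explosion formula: number of positions
# examined when looking N steps ahead with k possible moves at each step.
k, N = 10, 5                       # illustrative values, not from the lecture
positions = sum(k**n for n in range(N + 1))
print(positions)                   # 111111 -- grows exponentially with N
```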
101
Moravec's Paradox
high-level reasoning requires little computational effort but basic sensorimotor skills (e.g. perception/movement) are incredibly difficult to replicate
102
Who created the perceptron?
Frank Rosenblatt (1957)
103
the book Perceptrons was published by who when?
Minsky and Papert in 1969
104
The Frame Problem
actions change specific things while everything else ("the frame") remains the same; the problem is representing all the things an action leaves unchanged
105
a single layer perceptron could not compute which function
the XOR
106
Single layer perceptrons could only solve....
linearly separable classification problems
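A short derivation (not from the cards) of why no single linear threshold unit, with weights w1, w2 and bias b, can compute XOR:

```latex
% A single threshold unit outputs 1 iff  w_1 x_1 + w_2 x_2 + b > 0.  For XOR we would need:
f(0,0)=0 \Rightarrow b \le 0, \qquad
f(1,0)=1 \Rightarrow w_1 + b > 0, \qquad
f(0,1)=1 \Rightarrow w_2 + b > 0, \qquad
f(1,1)=0 \Rightarrow w_1 + w_2 + b \le 0.
% Adding the two middle inequalities gives  w_1 + w_2 + 2b > 0,  so
% w_1 + w_2 + b > -b \ge 0,  contradicting the last line.
% Hence no such weights exist: XOR is not linearly separable.
```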
107
the qualification problem
the difficulty in specifying all possible pre-conditions for an action
108
US ALPAC Report 1966
led to the withdrawal of US government funding for machine translation
109
ALPAC purpose
appointed to evaluate progress in computational linguistics, especially machine translation (MT)
110
UK Lighthill Report 1973
AI researchers had failed to address the issue of combinatorial explosion
111
US DARPA cancelled what at Carnegie Mellon University
Speech Understanding Research Programme
112
the US N______ R_______ ______ also cancelled funding
National Research Council
113
JR Lucas (1961) essentially reiterated
the Mathematical Objection that Turing had anticipated
114
Godel Incompleteness Theorem
any sufficiently powerful consistent formal system contains true statements that it cannot prove within the system itself
115
What did Lucas say about Godel's Incompleteness Theorem?
a human can recognise the truth of these unprovable statements but a computer cannot
116
according to Russell and Norvig, Godel's argument only applies to
Turing machines (TMs), since computers are only approximations of TMs (no infinite memory)
117
Russell and Norvig's 3 arguments to Lucas' arguement
GIT only applies to systems powerful enough, such as TMs; there are sentences that a given agent cannot consistently assert while other agents can; and it cannot be shown that humans are not also subject to GIT
118
2 books published by Hubert Dreyfus (and when?)
What Computers Can't Do (1972) and What Computers Still Can't Do (1992)
119
Dreyfus advocated for which of Turing's arguments
The Argument from Informality
120
Argument from Informality
the human mind is not constrained by rigid, formal rules like a machine
121
GOFAI
good old fashioned AI - all intelligent behaviour can be modelled by a system that reasons logically from a set of facts and rules
122
DENDRAL developed by who, when and where
Buchanan, Feigenbaum and Lederberg at Stanford
123
one of the earliest expert systems
DENDRAL
124
2 inputs to DENDRAL
mass spectrometry data and the chemical formula of a molecule
125
DENDRAL was powerful because what had been mapped?
the theoretical knowledge needed to solve the problem had been mapped into efficient special-purpose rules
126
DENDRAL marked a shift from
general reasoning (weak methods) over axioms to experts' rules that chunk large amounts of knowledge in the domain into specific rules
127
DENDRAL was the first k____-i_____ system
knowledge-intensive
128
knowledge intensive system
expertise stemmed from a large number of special purpose rules
129
MYCIN (1972)
diagnosed blood infections and recommended antibiotics/dosages
130
True/False: Did MYCIN outperform junior doctors?
yes
131
2 differences between DENDRAL and MYCIN
there was no general theoretical model from which the MYCIN rules could be deduced, and the rules had to reflect the uncertainty associated with medical knowledge
132
what method did MYCIN use to address uncertainty and what COULD it have used?
used certainty factors, rather than Bayesian statistics
133
why was MYCIN never deployed
ethical concerns and lack of electronic medical systems to integrate into
134
knowledge acquisition bottleneck
the difficulty of acquiring and encoding expert knowledge into an AI system
135
who argued about scripts?
Roger Schank
136
what was Roger Schanks' emphasis when he was arguing about scripts?
less emphasis on language itself and more on representing/reasoning with knowledge required for language understanding
137
scripts
representations of stereotypical situations which are used to interpret stories about such situations
138
Schank's Script Applier Mechanism Program
could answer questions about stories by combining inference from the text with world knowledge
139
scripts influenced which subdomain of applied NLP
information extraction/text mining
140
frame
collections of facts, procedures and default values for an object type
141
KL-One is an example of a
knowledge representation language
142
why did knowledge representation languages emerge
to support the organisation of hierarchies of frames and inference over them
143
knowledge representation languages were antecedents of what
OOP and description logics/ontologies/the semantic web
144
knowledge based systems comprise which 2 sub-systems
knowledge base + inference engine
145
inference engine
applies rules to a knowledge base to derive new facts/solve problems
146
inference engine works over _____ -rules
if-then (implication)
147
inference engines may use ....
forward or backward chaining
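A minimal backward-chaining sketch of the inference-engine idea in the cards above: to establish a goal, either find it as a known fact or find an if-then rule whose conclusion is the goal and recursively establish the rule's conditions. The facts and rules are invented for illustration.

```python
# Minimal backward-chaining inference engine sketch: a goal holds if it is a
# known fact, or if some if-then rule concludes it and all of that rule's
# conditions can themselves be established.
# The facts and rules below are invented for illustration.

facts = {"fever", "rash"}
rules = [(["fever", "rash"], "measles_suspected"),
         (["measles_suspected"], "refer_to_doctor")]   # (if-conditions, then-conclusion)

def backward_chain(goal):
    if goal in facts:
        return True
    return any(all(backward_chain(c) for c in conds)
               for conds, conclusion in rules if conclusion == goal)

print(backward_chain("refer_to_doctor"))   # True, via two chained rules
print(backward_chain("chickenpox"))        # False: no rule or fact supports it
```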
148
expert systems
what KBSs were called in the business world
149
first commercially successful KBS
R1 (later XCON)
150
what did R1/XCON do?
configure newly ordered computers
151
number of rules in R1/XCON
2500 rules
152
True/False: R1/XCON was 95-98% accurate
true
153
True/False: By 1988, most major US corporations had expert systems
True
154
When was the return of research funding?
1980s
155
how much did the UK govt put into the Alvey Project on IT
£350 million
156
Good Old-Fashioned AI (LKRR)
early AI's emphasis on logic, knowledge representation and reasoning
157
3 paradigms developed in the mid 80s
Connectionism, intelligent agents, embodied/situated AI
158
Connectionism revived...
neural network research
159
key feature of Hopfield nets
recurrent - output from one unit can be fed back into itself via other units
160
recurrent neural nets effectively have
internal memory
161
recurrent neural networks can exhibit ....
dynamic temporal behaviour
162
Hopfield nets are best suited for
unsegmented, connected handwriting tasks
163
RNNs use their internal memory to process...
arbitrary sequences of inputs
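A tiny sketch of the recurrence idea in the cards above: the output is fed back in, so the hidden state acts as internal memory summarising the input sequence seen so far. The weights and the input sequence are arbitrary illustrative numbers.

```python
# Tiny sketch of recurrence as internal memory: the state h is fed back in at
# every step, so it summarises the whole input sequence seen so far.
# The weights and inputs below are arbitrary illustrative numbers.

import math

w_x, w_h = 0.5, 0.8        # input weight and recurrent (feedback) weight
h = 0.0                    # internal memory, initially empty

for x in [1.0, 0.0, 1.0, 1.0]:            # an arbitrary-length input sequence
    h = math.tanh(w_x * x + w_h * h)      # new state depends on input AND old state
    print(round(h, 3))
```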
164
back propagation
sends the error at the output back through the net to adjust the weights of connections between neurons
165
what would the 'error' be in terms of back propagation?
the difference between the network's prediction and the actual result
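A minimal backpropagation sketch for a single sigmoid unit, using the error definition above (prediction minus target); the training example, learning rate and starting weights are illustrative values, not from the lecture.

```python
# Minimal backpropagation sketch for a single sigmoid unit: the error at the
# output (prediction - target) is sent back through the unit to adjust the
# weight and bias in proportion to their contribution.
# The example, learning rate and initial weights are illustrative.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.5, 0.0, 1.0          # weight, bias, learning rate
x, target = 1.0, 1.0              # one training example

for step in range(20):
    y = sigmoid(w * x + b)        # forward pass: the network's prediction
    error = y - target            # error at the output
    grad = error * y * (1 - y)    # back-propagate through the sigmoid (chain rule)
    w -= lr * grad * x            # adjust weights against the error gradient
    b -= lr * grad

print(round(sigmoid(w * x + b), 3))   # prediction has moved toward the target (1.0)
```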
166
who popularised back-propagation
Rumelhart and McClelland in Parallel Distributed Processing (1986)
167
back propagation led to interest in...
deep learning
168
embodied ai
to show real intelligence, a machine needs to have a body (to perceive the real world)
169
which human ability does the embodied AI approach claim is the least important?
abstract reasoning
170
which human ability does the embodied AI approach claim is the most important?
commonsense reasoning
172
who directly attacked the physical symbol system hypothesis?
Rodney Brooks in "Elephants Don't Play Chess"
173
intelligent agent
an autonomous entity which observes through sensors and acts upon an environment using actuators and directs its activity towards achieving goals
174
the focus of an intelligent agent is on...
integrating skills (learning, language and vision) into a single entity that can perceive and act in an uncertain, dynamic environment
175
internet bots are a type of...
intelligent agent
176
3 principal learning paradigms
supervised, unsupervised and reinforcement learning
177
supervised learning
labelled data, where the correct output for each input is known and the model learns to map inputs to outputs