Lab 1 Questions Flashcards
In what year did Alan Turing develop the Turing Test?
1950
In what year was the term “artificial intelligence” first used?
1956
While at the Cornell Aeronautical Laboratory, Frank Rosenblatt develops the perceptron,
the first artificial neuron, setting the stage for the development of networks of
artificial neurons, a.k.a. neural networks.
1957
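Note: the perceptron's decision rule is standardly written as follows (a textbook formulation, not part of the original card), where the weight vector \mathbf{w} and bias b are learned from labelled examples:

\hat{y} = \begin{cases} 1 & \text{if } \mathbf{w} \cdot \mathbf{x} + b > 0 \\ 0 & \text{otherwise} \end{cases}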
Joseph Weizenbaum develops ELIZA, the first chatbot. It is based on a set of rewrite rules
written by hand.
1966
Marvin Minsky & Seymour Papert publish a book that shows the limits of perceptrons
and argue for more work in symbolic computation (aka Good-Old-Fashioned AI, GOFAI).
The book is often cited as the main reason for the abandoning research on neural networks
1969
The Lighthill and ALPAC reports, which show little progress in AI, kill research
funding and lead to the first AI Winter.
early 1970s
The robot Shakey, programmed in Lisp, results in the development of the A* search
algorithm.
1972
Alain Colmerauer, who was a professor at the University of Montreal for a few years,
develops Prolog, a programming language based on logic that is very popular in AI for
writing rule-based systems.
1972
Marvin Minsky develops frames to reason with world knowledge. Years later, frames
turn out to be the basis of object-oriented programming.
1974
The expert system MYCIN is developed to recognise bacterial infections and recommend
antibiotics. Its recommendations are often better than those of human experts. It is
based on a knowledge base of ≈ 600 hand-written rules (written in Lisp) and developed
in collaboration with medical doctors.
1975
The METEO rule-based machine translation system, developed at the University of Montreal, is deployed at Environment Canada to translate weather forecasts from English to
French.
1975
Expert systems, such as MYCIN, and other types of systems made of hand-written rules
are considered too expensive to maintain and to adapt to new domains. The industry
drops research on such systems. It is the 2nd AI Winter.
early 1980s - early 1990s
Corinna Cortes and Vladimir Vapnik develop an approach to machine learning called soft-margin
Support Vector Machines (SVM), which quickly becomes one of the most popular
machine learning algorithms.
1993
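Note: the soft-margin SVM is standardly stated as the optimisation problem below (a textbook formulation, not part of the original card). The slack variables \xi_i allow some training points to violate the margin, and the parameter C trades margin width against training error, which is what distinguishes it from the earlier hard-margin formulation:

\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \; \frac{1}{2}\lVert \mathbf{w} \rVert^2 + C \sum_{i=1}^{n} \xi_i \quad \text{subject to} \quad y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \ge 1 - \xi_i, \quad \xi_i \ge 0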
After finishing his PhD on handwriting recognition, Yann LeCun makes the MNIST
dataset public. The dataset contains 70,000 images of handwritten digits and becomes the
benchmark for evaluating machine learning.
1998
Google launches its Google Translate service based on Statistical Machine Translation.
Translation rules are derived automatically from a statistical analysis of parallel texts
in different languages.
2006