Bashashati Lectures Flashcards
three tides (cycles) of AI
first tide (hype cycle)
second tide (still hype)
rise of the third tide
first computer
- in 1946, ENIAC: Electronic Numerical Integrator and Computer
- first general-purpose digital computer
- powered by vacuum tubes
- built for the US Army; one of its first tasks was H-bomb calculations for Los Alamos
- prompted Alan Turing to devise a test (the Turing test) for judging whether a machine exhibits human-like intelligence
Perceptron Mark I
- first artificial neural network
- built by Frank Rosenblatt (1958)
Marvin Minsky
- in 1969
- co-founder of the MIT AI Lab
1970-1980
first winter of AI
neural network breakthrough and fallout
- 1985: the backpropagation algorithm was independently rediscovered by several groups
- 1995: neural nets fell out of favor again: lack of good theory, tendency to overfit, seen as biologically implausible
- new methods like support vector machines (SVMs) became popular
2nd winter of AI
1985-1997
1997
Deep Blue vs Kasparov
- first computer program to defeat a reigning world chess champion in a match under tournament regulations
2006
- Fei-Fei Li started working on ImageNet
- task: classify ~1 million images into 1,000 categories
recent deep neural networks
2016-17: AlphaGo
2022: DALL-E 2 AI image generator
2022: ChatGPT
neurons and brain
- neurons receive signals, process them, and propagate signals onward
- neurons are slow
- about 100 billion neurons in the brain, and each is connected to about 100,000 other neurons
brain vs computers
brain
- recognizing faces, retrieving info, organizing info
- no CPU
- lots of slightly smart memory cells
computer
- arithmetic
- one very smart CPU
- lots of dumb memory cells
logistic vs linear regression for building a decision function
inputs -> each input is weighted -> linear combiner (weighted sum) -> activation function (identity for linear regression, sigmoid for logistic regression) -> output
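A minimal NumPy sketch of this pipeline (not from the lecture; the weights, bias, and inputs below are made-up illustration values), switching the activation between identity (linear regression) and sigmoid (logistic regression):

```python
import numpy as np

def decision_function(x, w, b, kind="linear"):
    """Inputs -> weighted sum (linear combiner) -> activation -> output."""
    z = np.dot(w, x) + b                       # linear combiner: weighted sum plus bias
    if kind == "linear":
        return z                               # identity activation (linear regression)
    if kind == "logistic":
        return 1.0 / (1.0 + np.exp(-z))        # sigmoid activation (logistic regression)
    raise ValueError(f"unknown kind: {kind}")

x = np.array([1.5, -0.3, 2.0])                 # example inputs (hypothetical values)
w = np.array([0.4, 0.1, -0.7])                 # one weight per input (hypothetical values)
b = 0.2                                        # bias term (hypothetical value)

print(decision_function(x, w, b, "linear"))    # unbounded real-valued output
print(decision_function(x, w, b, "logistic"))  # output squashed into (0, 1)
```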
building a neural network
M inputs -> each input is connected to D hidden units (keeping D < M) -> each hidden unit is connected to K outputs (see the sketch below)
- as you add more hidden layers, the model can represent increasingly complex functions
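A minimal sketch of such a network as a single forward pass in NumPy (not from the lecture; the sizes M=4, D=3, K=2 and the random weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
M, D, K = 4, 3, 2                     # inputs, hidden units (D < M here), outputs

W1 = rng.normal(size=(D, M))          # input -> hidden weights (random for illustration)
b1 = np.zeros(D)
W2 = rng.normal(size=(K, D))          # hidden -> output weights (random for illustration)
b2 = np.zeros(K)

def forward(x):
    """M inputs -> D hidden units (sigmoid) -> K outputs (linear)."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))  # hidden-layer activations
    return W2 @ h + b2                        # output layer

x = rng.normal(size=M)                # one example input vector
print(forward(x))                     # K output values
```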