Intro to AI Flashcards
key terms & definitions
Artificial Intelligence (AI)
The broad capability of machines to imitate human abilities.
What is AI? History/origins? Milestones?
AI involves machines mimicking human cognitive functions.
AI research started in the 1950s with the Turing Test; the term 'Artificial Intelligence' was coined by John McCarthy in 1956.
Milestones:
- 1997: IBM's chess-playing computer Deep Blue defeats world chess champion Garry Kasparov
- 2011: Apple integrates Siri into the iPhone
- 2014: Amazon releases the Alexa virtual assistant
- 2020: OpenAI releases GPT-3
- 2022: ChatGPT released to the public
What is the Turing Test?
A test of machine intelligence: if a machine can convince a human that it is human, it is considered intelligent.
Machine Learning (ML)
Subset of AI; algorithms improve through experience and learning.
Deep Learning (DL)
Subset of ML; uses neural networks with many layers
Generative-AI (Gen-AI)
Subset of DL; generates new data and outputs, such as text (e.g. ChatGPT), images (e.g. DALL-E, Stable Diffusion), and music.
Large Language Models (LLMs)
Subset of Gen-AI; e.g. the GPT series (GPT-4), Bard, Claude.
Top-Down Approach to AI
Aka - Symbolic Reasoning
- Models the way a person reasons through a problem: humans model their own reasoning process and program it into the computer
- Involves extracting knowledge from a human expert and representing it in a computer-readable form
- Develops a way to model that reasoning inside a computer
- Relies on explicit logic and rules to work (see the sketch below)
- The older, classical approach
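A minimal sketch of the top-down idea in Python (the rules and facts are invented purely for illustration): knowledge is written by hand as if-then rules, and the program derives conclusions with explicit logic.

```python
# Top-down / symbolic AI sketch: hand-written if-then rules plus
# forward chaining. Rules and facts are made up for illustration.
rules = [
    ({"has_feathers", "can_fly"}, "is_bird"),
    ({"is_bird", "sings"}, "is_songbird"),
]
facts = {"has_feathers", "can_fly", "sings"}

# Keep applying rules until no new facts can be derived.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)  # now includes the derived facts 'is_bird' and 'is_songbird'
```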
Bottom-up approach to AI
aka - Neural Networks
- Learns from examples
- Models the structure of the human brain, using artificial neurons to recognise patterns
- Each neuron acts like a weighted average of its inputs; a network of neurons can be trained to solve useful problems by learning from example data (see the sketch below)
- The newer approach; this is what is used now
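A minimal sketch of the "weighted average of inputs" idea for a single artificial neuron (all numbers are arbitrary, chosen only for illustration):

```python
# One artificial neuron: weighted sum of inputs plus a bias,
# passed through a simple step activation. Values are arbitrary.
def neuron(inputs, weights, bias):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if weighted_sum > 0 else 0   # "fire" only above the threshold

print(neuron([0.5, 0.8], [0.9, -0.2], bias=0.1))  # 1 (fires)
print(neuron([0.0, 0.9], [0.9, -0.2], bias=0.1))  # 0 (does not fire)
```

Training would adjust the weights and bias so the neuron's answers match the example data.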
Types of ML
- Supervised
- Unsupervised
- Reinforcement
What is supervised ML?
Learning from labeled data: the model is trained on examples that include the correct answer.
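A minimal supervised-learning sketch (assuming scikit-learn is installed; the tiny labeled dataset is invented):

```python
# Supervised learning: inputs are paired with known labels.
# Made-up data: [hours_studied, hours_slept] -> passed exam (1) or not (0).
from sklearn.linear_model import LogisticRegression

X = [[1, 5], [2, 6], [8, 7], [9, 8], [3, 4], [10, 6]]
y = [0, 0, 1, 1, 0, 1]   # the labels are what make this "supervised"

model = LogisticRegression().fit(X, y)
print(model.predict([[7, 7]]))  # e.g. [1] -- predicted to pass
```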
What is unsupervised ML?
Identifies patterns in data without labels
- e.g. spam emails share common patterns that can be grouped together without being labelled
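A minimal unsupervised-learning sketch: k-means clustering groups unlabeled points by similarity (scikit-learn assumed; the data is made up):

```python
# Unsupervised learning: no labels given; the algorithm finds structure itself.
from sklearn.cluster import KMeans

X = [[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],    # one natural group
     [8.0, 8.2], [7.9, 8.1], [8.3, 7.9]]    # another natural group

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # e.g. [0 0 0 1 1 1] -- discovered without any labels
```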
What is reinforcement ML?
Learning via feedback and rewards: an agent tries actions and learns from the rewards or penalties it receives.
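A minimal reinforcement-learning sketch: a two-armed bandit where the agent learns only from reward feedback (all probabilities are invented):

```python
# Reinforcement learning: try actions, receive rewards, prefer what paid off.
import random

reward_prob = {"A": 0.2, "B": 0.8}   # hidden from the agent
estimates = {"A": 0.0, "B": 0.0}
counts = {"A": 0, "B": 0}

for step in range(1000):
    # Explore 10% of the time, otherwise exploit the best-looking action.
    if random.random() < 0.1:
        action = random.choice(["A", "B"])
    else:
        action = max(estimates, key=estimates.get)
    reward = 1 if random.random() < reward_prob[action] else 0
    counts[action] += 1
    # Running average of rewards observed for this action.
    estimates[action] += (reward - estimates[action]) / counts[action]

print(estimates)  # estimate for "B" should end up near 0.8
```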
What is big data?
Large, complex, diverse, and high-quality datasets.
Connection between big data and AI?
- Data as the foundation: AI systems learn from data.
- The more data an AI system has, the better it can learn: making more accurate predictions and decisions, recognising patterns, and improving performance over time.
- The data must be cleaned, labelled, and prepared for use in AI models (see the sketch below).
- But not always: memory and compute must be sufficient for the AI to process big data.
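A minimal sketch of the "cleaned, labelled, and prepared" step (pandas assumed to be installed; the records are invented):

```python
# Preparing raw data for an AI model: drop duplicates and missing values,
# then attach labels (in practice, labelling is often done by humans).
import pandas as pd

raw = pd.DataFrame({
    "email_length": [120, 120, None, 45, 300],
    "num_links":    [10, 10, 2, 0, 25],
})

clean = raw.drop_duplicates().dropna().copy()    # cleaning
clean["label"] = ["spam", "not_spam", "spam"]    # labelling (illustrative)
print(clean)
```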
What is self-supervised learning?
How LLMs learn: they use inherent properties of the data to label the data themselves (see the sketch below).
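A minimal sketch of the self-labelling idea: training pairs are created automatically by hiding a word and using the hidden word as the label (the sentences are invented):

```python
# Self-supervised learning: the data labels itself -- here by masking one
# word in each sentence and treating that word as the prediction target.
sentences = ["the cat sat on the mat", "the dog chased the cat"]

training_pairs = []
for sentence in sentences:
    words = sentence.split()
    for i, word in enumerate(words):
        masked = words[:i] + ["[MASK]"] + words[i + 1:]
        training_pairs.append((" ".join(masked), word))   # (input, label)

print(training_pairs[0])  # ('[MASK] cat sat on the mat', 'the')
```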
List of LLMs (newest to oldest)
Newer Models:
1. OpenAI: GPT-4o (May 2024)
2. Anthropic: Claude 3 (March 2024)
3. Google DeepMind: Gemini (December 2023)
4. xAI: Grok-1 (November 2023)
5. Mistral AI: Mistral 7B (September 2023)
Original List:
Google: Bard (March 2023)
Anthropic: Claude (March 2023)
OpenAI: ChatGPT (November 2022)
Meta: OPT-IML (2022)
Baidu: Ernie 3.0 Titan (December 2021)
DeepMind: Gopher (December 2021)
NVIDIA: Megatron-Turing NLG (October 2021)
AI21 Labs: Jurassic-1 (August 2021)
Google: LaMDA (May 2021)
Basic terms for neural networks
- Neurons
- Weights
- Activation functions
- Paths
- Bias
Convolutional Neural Network (CNNs)
Used for image data
Recurrent Neural Networks (RNNs)
Suitable for sequence data, such as text or time series (see the sketch below)
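A minimal sketch of both ideas, with made-up numbers and no deep-learning library: a convolution slides a small filter over image pixels, while a recurrent step carries a hidden state across a sequence.

```python
# CNN idea: slide a 2x2 filter over a tiny "image", summing element-wise products.
image = [[1, 2, 0],
         [0, 1, 3],
         [4, 0, 1]]
kernel = [[1, 0],
          [0, 1]]

def convolve_at(img, ker, r, c):
    return sum(img[r + i][c + j] * ker[i][j] for i in range(2) for j in range(2))

feature_map = [[convolve_at(image, kernel, r, c) for c in range(2)] for r in range(2)]
print(feature_map)  # [[2, 5], [0, 2]]

# RNN idea: process a sequence one step at a time, carrying a hidden state.
import math
w_in, w_hidden, hidden = 0.5, 0.8, 0.0
for x in [1.0, 2.0, 3.0]:                  # a toy sequence
    hidden = math.tanh(w_in * x + w_hidden * hidden)
print(hidden)  # the final state summarises the whole sequence
```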
How do neural networks work?
Each neuron acts like a weighted average of its inputs, and we can train a network of neurons to solve useful problems by providing training data.
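A minimal forward-pass sketch of a small network of such neurons (the weights are arbitrary; training would adjust them to fit the example data):

```python
# Tiny two-layer network, forward pass only. Each neuron computes a weighted
# sum of its inputs plus a bias, then applies a sigmoid activation.
import math

def layer(inputs, weights, biases):
    outputs = []
    for w_row, b in zip(weights, biases):
        s = sum(x * w for x, w in zip(inputs, w_row)) + b
        outputs.append(1 / (1 + math.exp(-s)))    # sigmoid activation
    return outputs

x = [0.5, 0.2]                                     # input features
hidden = layer(x, [[0.4, -0.6], [0.3, 0.8]], [0.1, -0.2])
output = layer(hidden, [[1.2, -0.7]], [0.05])
print(output)  # a single number between 0 and 1
```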
What is the activation function?
Function that generates a neuron's output from its input signals and their weights.
- Uses a threshold to decide whether to pass the signal on (see the sketch below)
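A minimal sketch of three common activation functions applied to the same weighted sum; the step function shows the threshold behaviour described above.

```python
# Activation functions turn a neuron's weighted sum into its output.
import math

def step(s, threshold=0.0):    # fires only above the threshold
    return 1 if s > threshold else 0

def relu(s):                   # passes positive signals, blocks negative ones
    return max(0.0, s)

def sigmoid(s):                # squashes any value into (0, 1)
    return 1 / (1 + math.exp(-s))

weighted_sum = 0.7
print(step(weighted_sum), relu(weighted_sum), round(sigmoid(weighted_sum), 3))
# 1 0.7 0.668
```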
What is Bias (B)?
A number used to calibrate the summation (weighted-sum) function; a fine-tuning term that helps the network achieve better results.
- Example applications of neural networks: facial recognition, forecasting by identifying patterns, music composition
What is Natural Language Processing (NLP)?
Making sense of and generating human language
- used in chatbots, LLMs, translators
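A minimal sketch of one early NLP step: splitting raw text into tokens and counting them so a program can begin to make sense of it (the sentence is invented):

```python
# A first NLP step: tokenise text and count word frequencies.
from collections import Counter

text = "The cat sat on the mat because the mat was warm"
tokens = text.lower().split()      # crude tokenisation
counts = Counter(tokens)

print(tokens[:4])                  # ['the', 'cat', 'sat', 'on']
print(counts.most_common(2))       # [('the', 3), ('mat', 2)]
```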
What are the challenges to NLP?
Sarcasm, context, cultural nuances
- Different word, same meaning
- Different expression, same meaning
- Different grammar, same meaning
- Same word, different context
How does DALL-E 2 work?
- Image caption: starts with a textual description, which is the prompt
- Text embedding: the textual description is converted to a numerical representation
- Prior: randomness, called the "prior", is used to introduce diversity/noise into the generated images
- Encoder: the text embedding and prior are combined and processed through an encoder (essentially mixing the text and numbers together)
- Decoder: the encoder's intermediate representation is fed into a decoder, which transforms it into a synthesised image
- Image generation: through the decoder's iterative process, the final image is generated (see the pipeline sketch below)
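A pseudocode sketch of the pipeline above; every function here is a hypothetical stub that only names a stage and does not correspond to any real DALL-E 2 API.

```python
# Hypothetical stubs mirroring the DALL-E 2 stages listed above.
import random

def embed_text(prompt):                    # text -> numerical representation
    return [float(len(word)) for word in prompt.split()]

def sample_prior(text_embedding):          # inject randomness for diversity
    return [random.random() for _ in text_embedding]

def encode(text_embedding, prior):         # combine text embedding and prior
    return [t + p for t, p in zip(text_embedding, prior)]

def decode_iteratively(latent, steps=3):   # iteratively refine into an "image"
    image = latent
    for _ in range(steps):
        image = [round(v * 0.9, 3) for v in image]
    return image

prompt = "a cat in space"
text_embedding = embed_text(prompt)
latent = encode(text_embedding, sample_prior(text_embedding))
print(decode_iteratively(latent))          # stand-in for the generated image
```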