Task 1 - Outsourced intelligence Flashcards
Amara’s law
Brooks (2017)
We tend to overestimate the effect of technology in the short run and underestimate the effect in the long run
Clarke’s 3rd law
Brooks (2017)
Any sufficiently advanced technology is indistinguishable from magic
Artificial General Intelligence (AGI)
Refers to a type of artificial intelligence that possesses human-like cognitive abilities, allowing it to understand, learn, and apply knowledge across a wide range of tasks at a human or superhuman level
Suitcase words
Brooks (2017)
Words that carry a variety of meanings (for example, using the word “learning” for AI can create false expectations)
Exponentialism
Brooks (2017)
The assumption that because a technology has developed at an exponential rate so far, it will continue to do so in the future (but at some point there is a limit)
Large Language Model (LLM)
- A type of AI trained on vast amounts of text data to understand and generate human-like language using deep learning
- At its core, it predicts the most likely next word (token) given the input so far
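A toy Python sketch of the next-word-prediction idea; the context and probabilities here are made up, whereas a real LLM computes such probabilities with a neural network over a huge vocabulary.

```python
# Toy illustration of next-word prediction (not a real LLM): given a context,
# the model assigns a probability to each candidate next word and samples one.
import random

next_word_probs = {
    "the cat sat on the": {"mat": 0.6, "sofa": 0.3, "moon": 0.1},  # made-up numbers
}

def predict_next(context):
    probs = next_word_probs[context]
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights, k=1)[0]

print(predict_next("the cat sat on the"))  # most often prints "mat"
```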
Machine learning
A branch of AI that enables computers to learn patterns from data and make predictions or decisions without being explicitly programmed.
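A minimal sketch of the idea, assuming scikit-learn is available: the model is given examples rather than explicit rules and learns the underlying pattern itself.

```python
# Minimal sketch: the model learns a pattern (here, y = 1 when x > 5) from
# examples instead of being explicitly programmed with that rule.
from sklearn.linear_model import LogisticRegression

X = [[1], [2], [3], [7], [8], [9]]   # example inputs
y = [0, 0, 0, 1, 1, 1]               # labels the model should learn to predict

model = LogisticRegression().fit(X, y)
print(model.predict([[4], [6]]))     # predictions for unseen inputs
```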
Artificial Neural Network
A computational model inspired by biological neural networks, used for tasks such as pattern recognition, classification, and prediction
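A minimal sketch of a small neural network in PyTorch (assumed available): layers of weighted connections with non-linear activations between them.

```python
# Minimal sketch of an artificial neural network: stacked layers of weights
# (the parameters) with non-linear activations between them.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(4, 8),   # input layer -> hidden layer (learnable weights)
    nn.ReLU(),         # non-linear activation, loosely analogous to neuron firing
    nn.Linear(8, 2),   # hidden layer -> output layer (e.g. scores for 2 classes)
)

x = torch.randn(1, 4)  # one example with 4 input features
print(net(x))          # raw output scores for the 2 classes
```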
Transformer architecture
- Used by most LLMs
- Processes entire sequences at once rather than one word at a time (faster)
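A minimal NumPy sketch of scaled dot-product self-attention, the core operation of the transformer; the random matrices stand in for learned queries, keys, and values.

```python
# Minimal sketch of self-attention: every position attends to every other
# position in one matrix operation, so the whole sequence is processed at once.
import numpy as np

seq_len, d = 5, 8                    # 5 tokens, 8-dimensional representations
Q = np.random.randn(seq_len, d)      # queries (random placeholders)
K = np.random.randn(seq_len, d)      # keys
V = np.random.randn(seq_len, d)      # values

scores = Q @ K.T / np.sqrt(d)                                  # all-pairs similarity
weights = np.exp(scores - scores.max(axis=1, keepdims=True))   # row-wise softmax
weights /= weights.sum(axis=1, keepdims=True)
output = weights @ V                 # each token becomes a weighted mix of all tokens
print(output.shape)                  # (5, 8): the whole sequence updated in parallel
```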
How are LLMs trained?
(Pre-Training)
- The model is a neural network with many parameters (settings) that it uses to make predictions
- In training, the words in a sentence are treated as the predictors (the x in a regression equation) and the missing word(s) as the outcome (y)
- After guessing, the model compares its guess to the actual word and then updates its parameters
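A toy PyTorch sketch of a single pre-training step under simplifying assumptions (tiny vocabulary, 3-token context, arbitrary token ids); real pre-training repeats this over billions of examples.

```python
# One toy pre-training step: the context tokens are the predictors (x), the
# next token is the outcome (y); the model guesses, the loss measures the
# error, and the parameters are updated to reduce it.
import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 16
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.Flatten(),
    nn.Linear(embed_dim * 3, vocab_size),  # 3-token context -> score per vocabulary word
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.tensor([[5, 12, 7]])   # context of 3 token ids (the "predictors")
y = torch.tensor([42])           # the actual next token id (the "outcome")

logits = model(x)                # the model's guess (scores over the vocabulary)
loss = loss_fn(logits, y)        # how wrong the guess was
loss.backward()                  # backpropagation: credit assignment per parameter
optimizer.step()                 # update the parameters to reduce the error
```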
Backpropagation
A mathematical technique LLMs use after making an error: the error is traced back through the network to determine how much each parameter contributed to it
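A minimal PyTorch autograd sketch of the idea with a single parameter: after the error is computed, the gradient says how much that parameter contributed to it.

```python
# Minimal backpropagation sketch: compute an error, then trace it back to get
# the gradient of the error with respect to the parameter.
import torch

w = torch.tensor(2.0, requires_grad=True)   # one "parameter"
x = torch.tensor(3.0)                       # input
y_true = torch.tensor(10.0)                 # actual outcome

y_pred = w * x                   # the model's guess (here 6.0)
error = (y_pred - y_true) ** 2   # squared error (here 16.0)
error.backward()                 # trace back through the computation
print(w.grad)                    # d(error)/d(w) = 2 * (6 - 10) * 3 = -24.0
```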
Fine-tuning
A process that involves giving the pre-trained LLM new and more specific training data to adjust the model’s parameters for a specific task
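A minimal PyTorch sketch with a placeholder model and placeholder task data: the pre-trained parameters are the starting point, and training continues on new task-specific examples (here with the first layer frozen, a common but optional choice).

```python
# Minimal fine-tuning sketch: keep training a pre-trained model on new,
# task-specific data, typically with a small learning rate.
import torch
import torch.nn as nn

pretrained = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))  # stand-in

for p in pretrained[0].parameters():
    p.requires_grad = False              # optionally freeze the earliest layer

optimizer = torch.optim.Adam(
    [p for p in pretrained.parameters() if p.requires_grad], lr=1e-4
)
loss_fn = nn.CrossEntropyLoss()

task_x = torch.randn(32, 8)              # new, task-specific inputs (placeholders)
task_y = torch.randint(0, 2, (32,))      # new, task-specific labels

loss = loss_fn(pretrained(task_x), task_y)
loss.backward()
optimizer.step()                         # parameters adjusted for the new task
```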
Prompt-tuning
Prompt tuning (or in-context learning) is the practice of adjusting the input prompt given to a pre-trained model to guide its responses in a desired direction, without changing the model’s internal parameters
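A minimal sketch of in-context learning: the instructions and few-shot examples in the prompt steer the model while its parameters stay fixed; `generate` is a hypothetical stand-in for whatever LLM API is used.

```python
# Minimal in-context learning sketch: the prompt carries the task instructions
# and worked examples; no model parameters are changed.
few_shot_prompt = """Classify the sentiment of each sentence as positive or negative.

Sentence: "I loved this movie."      Sentiment: positive
Sentence: "The service was awful."   Sentiment: negative
Sentence: "What a wonderful day."    Sentiment:"""

# response = generate(few_shot_prompt)   # hypothetical LLM call; expected: "positive"
print(few_shot_prompt)
```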
Representational harms
Arise when the LLM represents some social groups in a less favorable light than others, demeans them, or fails to recognize their existence altogether
Allocational harms
Arise when AI algorithms differentially allocate resources (e.g. loans) or opportunities (e.g. therapy) to different social groups based on historically biased decision patterns represented in the data, such as biased diagnoses or biased assignment to therapy treatment
Probing
- Probing in the context of AI and large language models (LLMs) refers to analyzing or testing what a model has learned by examining its internal representations.
- Used to understand whether and how specific types of knowledge are encoded in the model’s hidden layers.
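A minimal sketch of a linear probe, assuming scikit-learn is available; the hidden states here are random placeholders, whereas a real probe would use activations extracted from the model’s hidden layers.

```python
# Minimal probing sketch: train a simple classifier on hidden representations;
# if it succeeds, the probed property is decodable from those representations.
import numpy as np
from sklearn.linear_model import LogisticRegression

hidden_states = np.random.randn(200, 64)    # 200 tokens x 64-dim hidden layer (placeholder)
labels = np.random.randint(0, 2, size=200)  # the property we probe for (placeholder)

probe = LogisticRegression(max_iter=1000).fit(hidden_states, labels)
print(probe.score(hidden_states, labels))   # training accuracy of the probe
                                            # (in practice, evaluate on held-out data)
```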
ELIZA
A computer program from 1966 that was the first simulation of a psychotherapist, designed to imitate the empathic communication style of Carl Rogers.
PARRY
- A program from the 1970s that simulated a person with paranoid schizophrenia and could converse with others.
- PARRY is often described as the first program to pass a version of the Turing Test
Super clinician
The idea that integrated AI technologies could provide a simulated practitioner whose capabilities exceed those of human practitioners
Expert systems
A computer program designed to incorporate the knowledge & ability of an expert in a particular domain.
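A minimal sketch of the rule-based idea behind classic expert systems; the rules and symptoms are made-up examples, not a real diagnostic system.

```python
# Minimal expert-system sketch: the expert's knowledge is encoded as explicit
# if-then rules rather than learned from data.
rules = [
    ({"fever", "cough"}, "possible flu"),
    ({"sneezing", "itchy eyes"}, "possible allergy"),
]

def diagnose(symptoms):
    conclusions = [conclusion for conditions, conclusion in rules
                   if conditions <= symptoms]      # a rule fires if all its conditions hold
    return conclusions or ["no rule matched"]

print(diagnose({"fever", "cough", "headache"}))    # ['possible flu']
```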
Cognitive modeling
- Cognitive modeling is the process of creating computational models that simulate human thought processes.
- These models aim to understand and replicate how people perceive, reason, learn, remember, and make decisions
Moore’s law
States that the number of transistors on an integrated circuit (and with it, computing power) doubles roughly every two years
Singularity principle
A hypothetical future point when artificial intelligence (AI) or other technologies advance to the point that they surpass human intelligence.
Temperature parameter
- Controls the randomness of the model’s output by adjusting the probability distribution over possible next words.
- It influences how deterministic or creative the model’s responses are
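A minimal NumPy sketch showing how dividing the logits by the temperature changes the output distribution before sampling.

```python
# Minimal temperature sketch: low temperature sharpens the distribution
# (more deterministic), high temperature flattens it (more random/creative).
import numpy as np

logits = np.array([2.0, 1.0, 0.5])               # raw scores for 3 candidate words

def softmax_with_temperature(logits, temperature):
    scaled = logits / temperature
    exp = np.exp(scaled - scaled.max())          # subtract max for numerical stability
    return exp / exp.sum()

print(softmax_with_temperature(logits, 0.5))     # peaked: the top word dominates
print(softmax_with_temperature(logits, 2.0))     # flatter: more randomness in sampling
```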