Week 2 Flashcards
Chapter 2
INTRODUCTION TO INTELLIGENT AGENTS (definition)
An intelligent agent is an autonomous entity that perceives its environment through sensors and acts upon that environment using actuators to achieve specific goals.
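This perceive-decide-act cycle can be sketched in a few lines of code. A minimal sketch, assuming a toy temperature-control task; the environment layout, function names, and numbers are illustrative, not anything defined in the chapter:

```python
# Minimal perceive-decide-act loop for an intelligent agent.
# The environment here is a toy temperature-control task (illustrative only).

def read_sensors(environment):
    """Perceive: sample the environment's state through a sensor."""
    return environment["temperature"]

def decide(percept, goal_temperature=22.0):
    """Map the percept to an action that pursues the agent's goal."""
    if percept < goal_temperature:
        return "heat"
    if percept > goal_temperature:
        return "cool"
    return "idle"

def act(environment, action):
    """Act: use an actuator to change the environment."""
    if action == "heat":
        environment["temperature"] += 0.5
    elif action == "cool":
        environment["temperature"] -= 0.5

environment = {"temperature": 18.0}
for _ in range(10):                     # the agent runs on its own
    percept = read_sensors(environment)
    action = decide(percept)
    act(environment, action)
print(environment["temperature"])       # moves toward the goal of 22.0
```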
History
1950: Turing Test
1956: Dartmouth Conference on AI
1970-1980: Expert systems
1974-1980: First AI winter
1980: Natural language processors
1987-1993: Second AI winter
1990: Intelligent Agents
2011: Virtual assistants
2016: Sophia
2018: BERT by Google
2020: Autonomous AI
2022: Gato by DeepMind
2022: vehicleDRX by Algotive
CHARACTERISTICS OF AN INTELLIGENT AGENT
Autonomy
Social Ability
Reactivity
Proactiveness
Learning and Adaptation
AUTONOMY (definition, importance, examples)
Definition:
- The ability of an agent to operate without the direct intervention of humans or other systems.
Importance:
- Empowers systems to make independent decisions.
- Essential for real-time applications where human intervention can be slow or impractical.
Examples:
- Autonomous drones navigating obstacles.
- Self-driving cars making lane-change decisions.
SOCIAL ABILITY (Definition, Importance, Examples)
Definition:
- The capability of an agent to interact effectively with other agents, systems, or humans.
Importance:
- Allows agents to gather, share, and act upon collective information.
- Crucial for systems where collaboration enhances efficiency.
Examples:
- Chatbots engaging in human-like conversations.
- Multi-agent systems collaborating to solve complex problems.
REACTIVITY (definition, importance, examples)
Definition:
- The ability of an agent to perceive and respond to its environment in real time.
Importance:
- Enables agents to adapt to changing conditions.
- Vital for safety-critical applications.
Examples:
- Industrial robots adjusting to unexpected obstacles.
- Voice assistants reacting to vocal commands.
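A reactive agent is often realized as a simple reflex agent: each percept is mapped straight to an action through condition-action rules, with no deliberation. A minimal sketch, assuming a made-up obstacle-avoidance scenario and rule table:

```python
# Reactive (simple reflex) agent: condition-action rules only,
# no model of the world and no planning ahead.

RULES = {
    "obstacle_ahead": "stop",
    "obstacle_left": "steer_right",
    "obstacle_right": "steer_left",
    "clear": "move_forward",
}

def reflex_agent(percept):
    """Respond to the current percept immediately, in real time."""
    return RULES.get(percept, "stop")   # fail safe on unknown percepts

for percept in ["clear", "obstacle_left", "clear", "obstacle_ahead"]:
    print(percept, "->", reflex_agent(percept))
```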
PROACTIVENESS (definition, importance, examples)
Definition:
- The capability of an agent to take initiative based on predictive modeling and anticipation of future events.
Importance:
- Enables agents to be forward-looking, planning actions ahead of time.
- Provides a competitive advantage in dynamic environments.
Examples:
- Smart thermostats predicting user behavior to pre-adjust temperatures.
- Trading bots anticipating market shifts and making pre-emptive trades.
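A rough sketch of proactiveness, assuming a thermostat agent that predicts the user's arrival time from a short (made-up) history and starts heating before the predicted event rather than reacting after it:

```python
# Proactive agent sketch: predict a future event from past observations
# and act ahead of it, instead of only reacting once it happens.

from statistics import mean

arrival_times = [17.5, 18.0, 17.75, 17.9]   # observed arrival hours (assumed data)

def predicted_arrival(history):
    """Naive predictive model: the average of past arrival times."""
    return mean(history)

def thermostat_step(current_hour, history, preheat_margin=0.5):
    """Start heating ahead of the predicted arrival time."""
    if current_hour >= predicted_arrival(history) - preheat_margin:
        return "pre-heat"
    return "standby"

print(thermostat_step(16.0, arrival_times))   # standby (too early)
print(thermostat_step(17.4, arrival_times))   # pre-heat (arrival anticipated)
```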
LEARNING AND ADAPTATION (definition, importance, examples)
Definition:
- The ability of an agent to improve its performance over time by learning from its experiences and adapting its actions accordingly.
Importance:
- Ensures continuous improvement and refinement.
- Allows agents to remain relevant and efficient in evolving scenarios.
Examples:
- Personalized content recommendations based on user history.
- Adaptive AI in gaming adjusting to players’ skill levels.
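A minimal sketch of learning and adaptation, assuming a two-action task with made-up rewards: the agent keeps a running average of the reward observed for each action and gradually shifts toward the better one (a simple bandit-style learner, not a method prescribed by the chapter):

```python
# Learning agent sketch: improve action choice over time from experience.
# Keeps a running average of observed rewards per action.

import random

estimates = {"action_a": 0.0, "action_b": 0.0}
counts = {"action_a": 0, "action_b": 0}

def true_reward(action):
    """Hidden environment: action_b is better, but the agent must learn that."""
    return random.gauss(1.0 if action == "action_b" else 0.2, 0.1)

def choose(epsilon=0.1):
    """Mostly exploit the best estimate, sometimes explore."""
    if random.random() < epsilon:
        return random.choice(list(estimates))
    return max(estimates, key=estimates.get)

for _ in range(200):
    action = choose()
    reward = true_reward(action)
    counts[action] += 1
    # Incremental running-average update: adapt the estimate from experience.
    estimates[action] += (reward - estimates[action]) / counts[action]

print(estimates)   # estimate for action_b converges near 1.0
```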
CRITERIA TO DESIGN AN AGENT
PEAS (Performance, Environment, Actuators, Sensors)
Example: designing an automated taxi driver
Performance measure: safe, fast, legal, comfortable trip, maximize profits
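Since PEAS is just a structured specification, it can be written down as a plain record. In the sketch below, only the performance measure comes from the card above; the environment, actuator, and sensor entries are plausible assumptions for the taxi example:

```python
# PEAS description as a simple record. The performance measure is from the
# card above; the other three fields are assumed values for a taxi driver.

from dataclasses import dataclass

@dataclass
class PEAS:
    performance: list[str]
    environment: list[str]
    actuators: list[str]
    sensors: list[str]

automated_taxi = PEAS(
    performance=["safe", "fast", "legal", "comfortable trip", "maximize profits"],
    environment=["roads", "other traffic", "pedestrians", "customers"],      # assumed
    actuators=["steering", "accelerator", "brake", "signal", "horn"],        # assumed
    sensors=["cameras", "speedometer", "GPS", "odometer", "engine sensors"], # assumed
)
print(automated_taxi.performance)
```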
Environment types:
- fully observable (vs. partially observable)
- deterministic (vs. stochastic)
- episodic (vs. sequential)
- static (vs. dynamic)
- discrete (vs. continuous)
- single agent (vs. multi-agent)
Fully Observable Environment: (give definition and example)
Partially Observable Environment: (give definition and example)
Fully Observable Environment:
* Definition: Agents can access all the states and details of the environment at any point in time.
* Example: A game of chess, where all pieces and their positions are visible to both players.
Partially Observable Environment:
* Definition: Agents have limited access to the states or details of the environment. Some information might be hidden or unknown.
* Example: A card game like poker, where players can’t see each other’s hands.
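One way to make the distinction concrete: in a fully observable environment the percept exposes the whole state, while in a partially observable one it is only a filtered view of it. The poker-like state layout below is a made-up illustration:

```python
# Fully vs. partially observable: what the percept exposes of the true state.

true_state = {
    "my_hand": ["A spades", "K spades"],
    "opponent_hand": ["2 diamonds", "7 clubs"],   # hidden in a partially observable game
    "community_cards": ["Q spades", "J spades", "10 spades"],
}

def fully_observable_percept(state):
    """Chess-like: the agent's percept is the entire state."""
    return dict(state)

def partially_observable_percept(state):
    """Poker-like: the opponent's hand is hidden from the percept."""
    return {k: v for k, v in state.items() if k != "opponent_hand"}

print(sorted(fully_observable_percept(true_state)))      # all three keys
print(sorted(partially_observable_percept(true_state)))  # opponent_hand missing
```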
Deterministic Environment: (give definition and example)
Stochastic Environment: (give definition and example)
Deterministic Environment:
* Definition: Outcomes of actions are predetermined and certain.
* Example: Chess (given a state and a move, the resulting state is always the same).
Stochastic Environment:
* Definition: Outcomes of actions are probabilistic and can vary.
* Example: Stock market predictions (actions or decisions can lead to various outcomes due to many unpredictable factors).
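The difference can be expressed as two transition functions: a deterministic one always returns the same successor state for a given state and action, while a stochastic one samples from several possible outcomes. The toy position task below is an assumption for illustration:

```python
# Deterministic vs. stochastic: the same action from the same state either
# always yields one successor, or yields one of several possible successors.

import random

def deterministic_step(position, action):
    """Chess-like: state + action fully determines the next state."""
    return position + (1 if action == "forward" else -1)

def stochastic_step(position, action):
    """Market-like: the outcome of the same action can vary."""
    intended = 1 if action == "forward" else -1
    return position + random.choice([intended, intended, 0, -intended])

print([deterministic_step(0, "forward") for _ in range(3)])   # always [1, 1, 1]
print([stochastic_step(0, "forward") for _ in range(3)])      # varies run to run
```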
Static Environment: (give definition and example)
Dynamic Environment: (give definition and example)
Static Environment:
* Definition: The environment remains unchanged until the agent performs an action.
* Example: A puzzle game like Sudoku. The game board remains the same until a player makes a move.
Dynamic Environment:
* Definition: The environment can change while the agent is deliberating, either due to external factors or other agents.
* Example: Stock market trading. Stock prices can fluctuate based on various factors even if a trader hasn’t made any decisions.
Discrete Environment: (give definition and example)
Continuous Environment: (give definition and example)
Discrete Environment:
* Definition: The environment has a finite number of distinct, separate states or outcomes.
* Example: A chess game, where an intelligent agent has to choose from a set number of legal moves at any given point in the game.
Continuous Environment:
* Definition: The environment can take on an infinite number of states within a given range.
* Example: An autonomous vehicle navigating through traffic, where quantities such as speed and steering angle vary continuously.
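The contrast comes down to the kind of action and state spaces involved: a finite set the agent can enumerate versus values drawn from a real-valued range. A small illustrative sketch (the move list and angle range are made up):

```python
# Discrete vs. continuous: a finite action set versus actions drawn
# from a real-valued range.

import random

# Discrete: a chess-like agent picks from a finite set of legal moves.
legal_moves = ["e4", "d4", "Nf3", "c4"]
move = random.choice(legal_moves)

# Continuous: a driving agent outputs a steering angle anywhere in a range.
steering_angle = random.uniform(-30.0, 30.0)   # degrees; infinitely many values

print(move, round(steering_angle, 2))
```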