Agents Flashcards
Agents
Anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators
Percepts
Inputs obtained through a sensor (e.g. a camera); the means by which the agent perceives its environment
Actuators
Tools that the agent uses to act upon the environment
Abstract description of an agent, external
The agent function maps from percept histories to actions: [f: P* -> A]
Concrete implementation of an agent, internal
The agent program runs on physical architecture to produce f : agent = architecture + program
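The agent function f: P* -> A can be sketched as a table-driven agent program that maps the full percept history to an action. This is a minimal illustration; the percepts, actions, and lookup table below (a two-location vacuum world) are assumed for the example.

```python
# Table-driven agent: the table encodes f by listing an action for
# every percept sequence; the program keeps the history and looks it up.
def make_table_driven_agent(table):
    percepts = []  # percept history P*

    def agent_program(percept):
        percepts.append(percept)
        # Look up the action for the full percept sequence to date.
        return table.get(tuple(percepts), "NoOp")

    return agent_program

# Illustrative table for a two-location vacuum world.
table = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Clean"), ("B", "Dirty")): "Suck",
}
agent = make_table_driven_agent(table)
print(agent(("A", "Clean")))  # Right
print(agent(("B", "Dirty")))  # Suck
```

The table grows exponentially with the length of the percept history, which is why real agent programs compute f rather than tabulate it.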
Rational Agent
For each possible percept sequence, a rational agent should select an action that is expected to maximize its performance measure, given the evidence provided by the percept sequence and whatever built-in knowledge the agent has.
Rationality at any moment depends on:
- the performance measure that defines the criterion of success
- the agent's prior knowledge of the environment
- the actions that the agent can perform
- the agent's percept sequence to date
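The definition above can be sketched as selecting the action with the highest expected performance given the percept sequence. The estimator below is an illustrative stand-in, not part of the definition.

```python
# A rational agent picks the action that maximizes expected performance,
# given the evidence (percept sequence) and built-in knowledge.
def select_action(percept_sequence, actions, expected_performance):
    # expected_performance(percepts, action) -> estimated score
    return max(actions, key=lambda a: expected_performance(percept_sequence, a))

# Toy estimator: in a vacuum world, sucking up dirt scores highest.
def expected_performance(percepts, action):
    location, status = percepts[-1]
    if status == "Dirty":
        return 10 if action == "Suck" else 0
    return 1 if action in ("Left", "Right") else 0

print(select_action([("A", "Dirty")], ["Suck", "Left", "Right"],
                    expected_performance))  # Suck
```

Rationality is about maximizing *expected* performance under the agent's knowledge, not the actual outcome, so a rational agent can still end up with a poor result.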
Exploration as it relates to Rationality
Taking actions whose purpose is to modify future percepts (information gathering), so that the agent avoids taking uninformed actions
Learning as it relates to Rationality
Adapting to a changing environment.
A successful agent computes its agent function in 3 periods:
1) when it is designed by its designer
2) when it decides its next action
3) when it learns from experiences to decide how to modify its behavior (improve the rules)
Autonomy as it relates to Rationality
The agent should learn what it can to compensate for partial or incorrect prior knowledge; an autonomous agent relies more on its own percepts than on the prior knowledge of its designer
Task Environment
The problem to which the rational agent is a solution. Environment type largely determines agent design.
PEAS
Performance Measure
Environment
Actuators
Sensors
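A PEAS description can be written out concretely; the entries below sketch the standard automated-taxi example, and the specific items listed are illustrative.

```python
# PEAS description for an automated taxi (illustrative entries).
peas_taxi = {
    "Performance": ["safe", "fast", "legal", "comfortable trip"],
    "Environment": ["roads", "other traffic", "pedestrians", "customers"],
    "Actuators": ["steering", "accelerator", "brake", "signal", "horn"],
    "Sensors": ["cameras", "sonar", "speedometer", "GPS", "odometer"],
}
print(sorted(peas_taxi))  # ['Actuators', 'Environment', 'Performance', 'Sensors']
```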
Environment types:
Fully Observable
(vs. partially observable)
An agent’s sensors give it access to the complete state of the environment at each point in time
Environment types:
Deterministic
(vs. stochastic)
The next state of the environment is completely determined by the current state and the action executed by the agent
Environment types:
Strategic
The environment is deterministic except for the actions of other agents