2 - Agents Flashcards
Agent - definition and explanation
an AGENT is anything that can be viewed as perceiving its environment through sensors and acting upon it - it reads the world through its senses and reacts based on the input its sensors receive
rational agent
a system that maximizes its performance measure, given the percept sequence so far and its built-in knowledge
Task Environment Specification
(PEAS) - Performance measure, Environment, Actuators, Sensors
percept
the agent’s input at any given instant
percept sequence
the complete history of everything the agent has perceived
agent function
maps any given percept sequence to an action
agent program
the concrete implementation of the agent function; it runs on the physical architecture
Table-driven agent
looks up the entire percept sequence in a table to select an action; impractical because the tables reach daunting sizes that no physical agent can store
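The lookup idea can be sketched in a few lines; the two-square vacuum world and its table entries below are illustrative assumptions, not part of the card. The table is indexed by the whole percept history, which is exactly why it explodes in size.

```python
# Table-driven agent sketch: the table maps every possible percept
# sequence to an action. Percept = (location, status) in a hypothetical
# two-square vacuum world.

def make_table_driven_agent(table):
    percepts = []  # the percept sequence observed so far

    def agent(percept):
        percepts.append(percept)
        return table.get(tuple(percepts))  # look up the whole history

    return agent

# One table entry per percept *sequence*, hence the exponential growth.
table = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Dirty"), ("A", "Clean")): "Right",
}

agent = make_table_driven_agent(table)
print(agent(("A", "Dirty")))  # Suck
print(agent(("A", "Clean")))  # Right
```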
Simple reflex agent
action selection based only on the current percept; condition-action rules => if-then rules
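A minimal sketch of such condition-action rules, again using the illustrative two-square vacuum world (an assumed example, not part of the card): the action depends only on the current percept, with no memory of earlier ones.

```python
# Simple reflex agent: each branch is a condition-action (if-then) rule
# applied to the current percept only.

def reflex_vacuum_agent(percept):
    location, status = percept
    if status == "Dirty":      # rule: dirty square -> clean it
        return "Suck"
    elif location == "A":      # rule: clean and in A -> move right
        return "Right"
    else:                      # rule: clean and in B -> move left
        return "Left"

print(reflex_vacuum_agent(("A", "Dirty")))  # Suck
print(reflex_vacuum_agent(("B", "Clean")))  # Left
```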
Model based reflex agent
has an internal state storing aspects of the environment that cannot be immediately observed - it keeps that knowledge, compares it with the current percept, and updates its model of the world
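A sketch of that internal state, under the same assumed vacuum-world setup: the agent remembers the status of the square it is not currently in, which the sensors cannot observe right now.

```python
# Model-based reflex agent: the dict `model` is the internal state,
# recording the believed status of each square (None = never observed).

def make_model_based_vacuum_agent():
    model = {"A": None, "B": None}

    def agent(percept):
        location, status = percept
        model[location] = status  # update internal state from the percept
        if status == "Dirty":
            return "Suck"
        other = "B" if location == "A" else "A"
        if model[other] != "Clean":  # unknown or dirty -> go inspect it
            return "Right" if location == "A" else "Left"
        return "NoOp"  # believes both squares are clean

    return agent

agent = make_model_based_vacuum_agent()
print(agent(("A", "Dirty")))  # Suck
print(agent(("A", "Clean")))  # Right (B's status is still unknown)
```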
Goal based
selects actions in the current state that lead toward completing its goal
Utility based agent
evaluates the desirability of the states resulting from each possible action by mapping states onto a number (a utility)
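The selection rule itself is just an argmax over utilities of resulting states. The state names, utility values, and transition model below are made-up illustrations, not part of the card.

```python
# Utility-based action selection: map each resulting state onto a number
# and pick the action whose outcome has the highest utility.

utility = {"clean_room": 10, "charging": 4, "stuck": -5}  # assumed values

def result(state, action):
    # Hypothetical deterministic transition model for illustration.
    transitions = {
        ("hallway", "clean"): "clean_room",
        ("hallway", "dock"): "charging",
        ("hallway", "wander"): "stuck",
    }
    return transitions[(state, action)]

def utility_based_choice(state, actions):
    return max(actions, key=lambda a: utility[result(state, a)])

print(utility_based_choice("hallway", ["clean", "dock", "wander"]))  # clean
```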
Learning/autonomous agent
learns its own methods for selecting actions - it is taught rather than explicitly instructed
Fully observable vs. partially observable
Do the agent’s sensors give it access to the complete state of the environment?
Deterministic vs. stochastic
Is the next state of the environment completely determined by the current state
and the agent’s action?
Episodic vs. sequential
Is the agent’s experience divided into independent episodes, or is it one continuous sequence of observations and actions?
Static vs. dynamic
Is the environment changing while the agent is thinking?
Discrete vs. continuous
Does the environment have a fixed number of distinct percepts?
Single agent vs. multi-agent
Is the agent operating by itself in the environment?