Intelligent Agents Flashcards
PEAS
Performance Measure - the criteria by which we can measure the success of the agent’s behaviour.
Environment - the surroundings in which the agent operates.
Actuators - the means by which the agent acts upon the environment.
Sensors - the means by which the agent perceives the environment.
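As a worked example (the classic automated taxi driver; the items listed are illustrative, not exhaustive), a PEAS description could be sketched in Python as:

    # Illustrative PEAS description for an automated taxi driver.
    taxi_peas = {
        "performance_measure": ["safety", "speed", "legality", "passenger comfort", "profit"],
        "environment": ["roads", "other traffic", "pedestrians", "customers"],
        "actuators": ["steering", "accelerator", "brake", "indicators", "horn"],
        "sensors": ["cameras", "GPS", "speedometer", "odometer", "sonar"],
    }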
Fully Observable
Partially Observable
In a fully observable environment, the agent can obtain complete, up-to-date information about the environment's state.
In a partially observable environment, the agent cannot obtain complete information about the state - some of it is hidden (e.g., poker, where opponents' cards are unseen).
Deterministic
Stochastic
In a deterministic environment, each action has a single guaranteed effect; there is no uncertainty about the resulting state.
In a stochastic environment, the effect of an action is not guaranteed; the same action may lead to different outcomes.
Episodic
Sequential
Episodic environments are divided into discrete episodes with no links between them - the agent's performance in one episode is independent of the others.
Sequential environments have links between episodes - the current decision can affect later ones, so episodes are dependent.
Static
Dynamic
A static environment can be assumed to remain unchanged while the agent is deliberating.
A dynamic environment has other processes operating on it, so it changes while the agent is deliberating (e.g., the physical world).
Discrete
Continuous
A discrete environment has a fixed, finite number of possible states.
A continuous environment has an infinite (or not finitely enumerable) number of possible states.
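As a hedged worked example, two classic tasks classified along the dimensions above (borderline cases can be argued either way):

    # Example classification of two task environments along the dimensions above.
    environments = {
        "taxi driving":     {"observability": "partially observable", "effects": "stochastic",
                             "episodes": "sequential", "change": "dynamic", "states": "continuous"},
        "crossword puzzle": {"observability": "fully observable", "effects": "deterministic",
                             "episodes": "sequential", "change": "static", "states": "discrete"},
    }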
Simple Reflex Agents
Select actions to execute based upon the current percept only - they do not store any record of the environment's state (they have no memory).
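A minimal sketch of a simple reflex agent, using a hypothetical two-square vacuum world (the percept format and condition-action rules are assumptions for illustration):

    # Simple reflex agent: maps the current percept straight to an action via
    # condition-action rules; no percept history is stored.
    def simple_reflex_vacuum_agent(percept):
        location, status = percept          # e.g. ("A", "Dirty")
        if status == "Dirty":
            return "Suck"
        elif location == "A":
            return "Right"
        else:
            return "Left"

    print(simple_reflex_vacuum_agent(("A", "Dirty")))  # -> Suck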
Model-Based Agents
Maintain an internal state that depends upon the percept history.
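A sketch of the model-based idea; the state representation and update rule below are placeholders, not a specific algorithm:

    # Model-based agent: keeps an internal state updated from the percept
    # history, then chooses an action based on that state.
    class ModelBasedAgent:
        def __init__(self):
            self.state = {}                  # internal model of the world

        def update_state(self, percept):
            self.state.update(percept)       # placeholder world-model update

        def act(self, percept):
            self.update_state(percept)
            return "clean" if self.state.get("dirty") else "move"

    agent = ModelBasedAgent()
    print(agent.act({"dirty": True}))   # -> clean
    print(agent.act({"dirty": False}))  # -> move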
Goal-Based Agents
Select appropriate actions to achieve desirable states of the environment (goals).
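A sketch of goal-based action selection; the transition model and goal test are toy examples invented for illustration:

    # Goal-based selection: pick an action whose predicted resulting state
    # satisfies the goal test (a real agent would usually need search/planning).
    def goal_based_choice(state, actions, result, goal_test):
        for action in actions:
            if goal_test(result(state, action)):
                return action
        return None

    # Toy example: move along a line towards position 3.
    result = lambda s, a: s + (1 if a == "right" else -1)
    print(goal_based_choice(2, ["left", "right"], result, lambda s: s == 3))  # -> right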
Utility-Based Agents
Make use of a utility function to compare the desirability of the different states that result from actions.
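A sketch of utility-based selection; the utility function and transition model are made up for illustration:

    # Utility-based selection: compare the desirability (utility) of the states
    # each action is predicted to lead to, and pick the best one.
    def utility_based_choice(state, actions, result, utility):
        return max(actions, key=lambda a: utility(result(state, a)))

    # Toy example: utility is higher the closer the agent gets to position 5.
    result = lambda s, a: s + (1 if a == "right" else -1)
    print(utility_based_choice(2, ["left", "right"], result, lambda s: -abs(5 - s)))  # -> right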
Learning Agents
Learn from previous experience. A performance standard defines what counts as a good outcome of an action, and a problem generator suggests exploratory actions that create new learning experiences.
Improve their own performance; "Unlike intelligent agents that act on the information provided by a programmer, learning agents are able to perform tasks, analyze performance and look for new ways to improve on those tasks."
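A structural sketch of the usual four components of a learning agent; the method bodies are placeholders, not a real learning algorithm:

    import random

    class LearningAgent:
        def __init__(self):
            self.rules = {}                          # knowledge used when acting

        def performance_element(self, percept):
            # Chooses the external action using what has been learned so far.
            return self.rules.get(percept, "explore")

        def critic(self, feedback):
            # Judges the outcome against a fixed performance standard.
            return feedback > 0

        def learning_element(self, percept, action, good_outcome):
            # Improves the acting rules when the critic approves of the outcome.
            if good_outcome:
                self.rules[percept] = action

        def problem_generator(self, actions):
            # Suggests exploratory actions that lead to new, informative experiences.
            return random.choice(actions)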
Agent
Anything that perceives its environment through sensors and acts upon that environment via actuators.
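The basic sense-act cycle can be sketched as a loop; the environment's get_percept and execute methods are hypothetical names used only for illustration:

    # Generic agent-environment loop.
    def run(agent, environment, steps=10):
        for _ in range(steps):
            percept = environment.get_percept()   # sensors read the environment
            action = agent.act(percept)           # the agent program chooses an action
            environment.execute(action)           # actuators change the environment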