Environment types (slides2) Flashcards
Fully observable (vs. partially observable)
An agent’s sensors give it access to the complete state of the environment at each point in time
Deterministic
The next state is completely determined by the current state and the action executed by the agent
Strategic
The environment is deterministic except for the actions of other agents
Stochastic
There is uncertainty about the next state, and it is expressed with probabilities
Non-deterministic
There is uncertainty about the next state, but no probabilities are available
Uncertain
Not fully observable or not deterministic
Episodic (vs. sequential)
The agent's experience is divided into atomic episodes: in each episode the agent perceives and then performs a single action, and the choice of action depends only on the episode itself. In sequential environments, by contrast, the current decision may affect future decisions.
Static (vs. dynamic)
The environment is unchanged while the agent is deliberating; the agent need not keep looking at the world while deliberating, nor worry about the passage of time.
Semidynamic
The environment itself does not change with the passage of time, but the agent's performance score does
Discrete (vs. continuous)
A finite number of distinct, clearly defined states, percepts, and actions. The same applies to time.
Single agent (vs. multiagent)
An agent operating by itself in an environment
Known (vs. unknown)
Depends on the knowledge that the agent (or its designer) has of the rules governing the environment. In a known environment, every action has a known outcome (if deterministic) or a known probability distribution over the possible outcomes (if stochastic).
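As a concrete illustration of the deterministic vs. stochastic distinction in a known environment, here is a minimal Python sketch (the states, actions, and variable names are made up for illustration, not from the slides): a deterministic model maps each (state, action) pair to a single next state, while a stochastic model maps it to a probability distribution over next states.

```python
# Illustrative sketch: transition models for a known environment.
from typing import Dict, Tuple

State = str
Action = str

# Deterministic: each (state, action) pair has exactly one outcome.
deterministic_model: Dict[Tuple[State, Action], State] = {
    ("s0", "go"): "s1",
    ("s1", "go"): "s2",
}

# Stochastic: each (state, action) pair has a probability
# distribution over the possible next states.
stochastic_model: Dict[Tuple[State, Action], Dict[State, float]] = {
    ("s0", "go"): {"s1": 0.8, "s0": 0.2},
    ("s1", "go"): {"s2": 0.9, "s1": 0.1},
}
```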
Chess with a clock
Fully observable, strategic, sequential, semidynamic, discrete, multiagent
Chess without a clock
Fully observable, strategic, sequential, static, discrete, multiagent
Taxi driving
Partially observable, not deterministic, sequential, dynamic, continuous, multiagent
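The three example classifications above can also be written out as a small data structure. This is just an illustrative sketch (the class, field names, and value strings are mine, not from the slides), using the same vocabulary as the cards.

```python
# Illustrative sketch: the environment-type checklist as a dataclass,
# filled in for the three example environments on the cards above.
from dataclasses import dataclass

@dataclass
class EnvironmentType:
    observable: str     # "fully" or "partially"
    determinism: str    # "deterministic", "strategic", "stochastic",
                        # "non-deterministic", or "not deterministic"
    episodic: bool      # False means sequential
    dynamics: str       # "static", "semidynamic", or "dynamic"
    discrete: bool      # False means continuous
    single_agent: bool  # False means multiagent

chess_with_clock = EnvironmentType("fully", "strategic", False, "semidynamic", True, False)
chess_no_clock = EnvironmentType("fully", "strategic", False, "static", True, False)
taxi_driving = EnvironmentType("partially", "not deterministic", False, "dynamic", False, False)
```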