Agents Flashcards

1
Q

Definition of a rational agent:

A

For each possible percept history, a rational agent selects an action
that is expected to maximize its performance measure, given the evidence provided by the percept history and
whatever built-in knowledge the agent has.

2
Q

What does PEAS stand for?

A

Performance measure, Environment, Actuators, Sensors. It is used to describe agents.
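The PEAS scheme can be made concrete with the classic automated-taxi example; a minimal sketch as a plain Python dict (the specific list entries are illustrative, not exhaustive):

```python
# PEAS description of an automated taxi, sketched as a dict.
# Field names follow the PEAS acronym; the entries are illustrative examples.
peas_taxi = {
    "Performance": ["safe", "fast", "legal", "comfortable trip"],
    "Environment": ["roads", "other traffic", "pedestrians", "customers"],
    "Actuators":   ["steering", "accelerator", "brake", "signal", "horn"],
    "Sensors":     ["cameras", "speedometer", "GPS", "odometer"],
}

for field, examples in peas_taxi.items():
    print(f"{field}: {', '.join(examples)}")
```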

3
Q

Fully/Partly observable environment…

A

Fully observable: the agent's sensors detect all properties of the world relevant to the choice of the current action. Partly observable: some relevant parts of the state are hidden from the agent, e.g. because of noisy or missing sensors.

4
Q

Single/Multi - agent environment…

A

Single-agent: only one agent acts, so there is no cooperation and no competition. Multi-agent: several agents act in the same environment, either cooperatively or competitively (e.g. chess is a competitive two-agent environment).

5
Q

deterministic / stochastic environment…

A

An environment is deterministic if the next state is completely determined by the current state and the agent’s action. It is stochastic if the next state is governed by a probability distribution over possible outcomes.

6
Q

episodic / sequential environment…

A

Sequential environments require memory of past actions to determine the next best action. Episodic environments are a series of one-shot episodes in which only the current (or recent) percept is relevant. An AI that examines radiology images to decide whether a disease is present acts in an episodic environment: one image has nothing to do with the next.

7
Q

static / dynamic environment…

A

Static: the world does not change during the reasoning (deliberation) time of the agent. Dynamic: the world can change while the agent is deliberating. Semi-dynamic: the world itself is static, but the performance score decreases with deliberation time.

8
Q

discrete / continuous environment…

A

Discrete: world properties such as time, percepts, actions, and the number of possible states take distinct, countable values (e.g. chess). Continuous: properties range over a continuum (e.g. steering angles and speeds in driving).

9
Q

known / unknown environment…

A

An environment is considered “known” if the agent understands the laws that govern the environment’s behavior. For example, in chess the agent knows that when a piece is “taken” it is removed from the game; on a street, the agent might know that when it rains, the roads get slippery. In an unknown environment, the agent must first learn how its actions affect the world before it can act well.

10
Q

four basic types of agents:

A
  • Simple reflex agents (no memory; act on the current percept only, not on percept sequences)
  • Model-based reflex agents (internal state -> memory, model of the environment)
  • Goal-based agents (model of the world plus an explicit goal; search & planning)
  • Utility-based agents (“happiness”: a utility function -> expected utility for decisions, resolves conflicting goals)
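The simplest of these types can be sketched in a few lines; a minimal sketch of a simple reflex agent in the classic two-location vacuum world (the percept format and condition-action rules here are illustrative assumptions):

```python
# Simple reflex agent for a toy vacuum world with two locations,
# "A" and "B", each either "Dirty" or "Clean".
# The agent has no memory: it maps the current percept directly to an action.

def simple_reflex_vacuum_agent(percept):
    """Condition-action rules applied to the current percept only."""
    location, status = percept
    if status == "Dirty":
        return "Suck"   # rule: current square dirty -> clean it
    elif location == "A":
        return "Right"  # rule: clean at A -> move to B
    else:
        return "Left"   # rule: clean at B -> move to A

# The decision depends only on the current percept, never on history:
print(simple_reflex_vacuum_agent(("A", "Dirty")))  # Suck
print(simple_reflex_vacuum_agent(("A", "Clean")))  # Right
```

A model-based reflex agent would extend this with an internal state updated from the percept history; a goal-based or utility-based agent would instead search over action sequences.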