Deductive reasoning, practical reasoning, ontologies Flashcards

1
Q

Define an ontology

A

a formal, explicit specification of a shared conceptualisation

2
Q

What is the Semantic Web approach?

A

representing information in machine-readable form

3
Q

What are the components of ontologies?

A

Individuals, Classes, Attributes, Relations

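These four components can be illustrated with a minimal sketch in plain Python, using a set of (subject, predicate, object) triples; the university vocabulary here is invented for illustration:

```python
# A tiny ontology as a set of (subject, predicate, object) triples.
# The vocabulary (Lecturer, Course, teaches, ...) is hypothetical.
ontology = {
    # Classes
    ("Lecturer", "is_a", "Class"),
    ("Course", "is_a", "Class"),
    # Individuals and their class membership
    ("alice", "instance_of", "Lecturer"),
    ("ai101", "instance_of", "Course"),
    # Relation: connects two individuals
    ("alice", "teaches", "ai101"),
    # Attribute: connects an individual to a data value
    ("ai101", "credits", 6),
}

def instances_of(cls):
    """All individuals asserted to belong to a class."""
    return {s for (s, p, o) in ontology if p == "instance_of" and o == cls}

print(instances_of("Lecturer"))  # {'alice'}
```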
4
Q

There are two types of ontologies. Which are they?

A

Formal and informal

5
Q

There are three species of OWL. Which are they?

A

OWL Full, OWL DL (description logic), OWL Lite

6
Q

Which kinds of properties does OWL have?

A

Object properties, which relate objects to other objects, and datatype properties, which relate objects to datatype values.

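The distinction can be sketched with plain Python dataclasses; the Lecturer/Course model is made up for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Course:
    name: str

@dataclass
class Lecturer:
    name: str
    # Object property: relates a Lecturer to other objects (Courses).
    teaches: list = field(default_factory=list)
    # Datatype property: relates a Lecturer to a literal value.
    age: int = 0

ai = Course("AI 101")
alice = Lecturer("Alice", teaches=[ai], age=42)
print(alice.teaches[0].name)  # object property leads to another object
print(alice.age)              # datatype property leads to a data value
```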
7
Q

What does OWL stand for?

A

Web Ontology Language

8
Q

What world assumption does OWL make?

A

It follows the open-world assumption: because the WWW is so large, missing information cannot be assumed false. OWL also does not make the unique names assumption, so two different names may refer to the same individual unless stated otherwise.

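The contrast with the closed-world assumption can be sketched in a few lines of Python; the facts are invented for illustration:

```python
# A knowledge base on the Web is never assumed complete.
facts = {("alice", "teaches", "ai101")}

def holds_cwa(triple):
    """Closed-world assumption: anything not known is false."""
    return triple in facts

def holds_owa(triple):
    """Open-world assumption: anything not known is merely unknown."""
    return True if triple in facts else "unknown"

query = ("bob", "teaches", "ai101")
print(holds_cwa(query))  # False - CWA treats absence as negation
print(holds_owa(query))  # 'unknown' - OWA leaves it open
```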
9
Q

Ontologies provide a shared understanding of a domain:

A

Semantic interoperability: differences in terminology can be overcome, and mappings between ontologies become possible.

10
Q

What are the differences between the 3 species of OWL?

A

Expressiveness and complexity: more features bring greater expressive power, but at the cost of completeness of reasoning, decidability, and compatibility with the other species.

11
Q

Give an example of an alternative to OWL:

A

Cyc, Dublin Core, WordNet

12
Q

The representation/reasoning problem:

A

How to symbolically represent information about complex real-world entities and processes, and how to get agents to reason with this information in time for the results to be useful?

13
Q

Deductive Reasoning Agents

A

They decide what to do on the basis of a theory stating the best action to perform in any given situation. An example is the vacuum-cleaner world agent.

14
Q

Agent oriented programming:

A

Programming agents in terms of intentional notions such as intention, belief and commitment; the intentional stance is used as an abstraction tool for programming.

15
Q

Shoham suggested that a complete AOP system will have 3 components

A

  • a logic for specifying agents and describing their mental states,
  • an interpreted programming language for programming agents (e.g. AGENT0), and
  • an ‘agentification’ process for converting ‘neutral applications’ (databases etc.) into agents.

17
Q

Agents in AGENT0 have four components:

A

  • a set of capabilities (things the agent can do),
  • a set of initial beliefs,
  • a set of initial commitments (things the agent will do), and
  • a set of commitment rules (each containing a message condition, a mental condition, and an action).

18
Q

Give a short description of METATEM

A

Based on the direct execution of logical formulas: the behaviour an agent should exhibit is given as a temporal logic specification, which is executed directly. Temporal operators express notions such as ‘always’, ‘at some time’, ‘next (tomorrow)’ and ‘last (yesterday)’.
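A simplified sketch of these temporal operators over a finite trace of states (a strong simplification of METATEM's execution model; the propositions are invented):

```python
# A trace is a list of states; each state is the set of propositions true then.
trace = [{"idle"}, {"idle"}, {"working"}, {"done"}]

def always(prop, trace):
    """'always p': p holds in every state of the trace."""
    return all(prop in state for state in trace)

def sometime(prop, trace):
    """'at some time p': p holds in at least one state."""
    return any(prop in state for state in trace)

def next_(prop, trace, i):
    """'tomorrow p' at position i: p holds in the following state."""
    return i + 1 < len(trace) and prop in trace[i + 1]

print(sometime("done", trace))     # True
print(always("idle", trace))       # False
print(next_("working", trace, 1))  # True
```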

19
Q

Define practical reasoning

A

reasoning directed towards actions (“what to do?”)

20
Q

theoretical reasoning

A

reasoning directed towards beliefs

21
Q

deliberation

A

deciding what state of affairs we want to achieve

22
Q

Human practical reasoning consists of two activities

A

deliberation and means-ends reasoning

23
Q

What role do intentions play in practical reasoning?

A

Intentions drive means-ends reasoning; intentions persist; intentions constrain future deliberation; intentions influence the beliefs upon which future practical reasoning is based.

24
Q

Plan:

A

Given a planning problem (beliefs, goals, possible actions), a plan is a sequence of actions aimed at reaching the goal.
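
A planning problem and a plan as an action sequence can be sketched as a breadth-first search over states; the tea-making actions are invented for illustration:

```python
from collections import deque

# Each action maps a name to (preconditions, add-effects, delete-effects).
actions = {
    "boil_water": (frozenset(), {"hot_water"}, set()),
    "add_tea":    (frozenset({"hot_water"}), {"tea_made"}, set()),
}

def plan(beliefs, goal):
    """Breadth-first search for an action sequence that reaches the goal."""
    frontier = deque([(frozenset(beliefs), [])])
    seen = {frozenset(beliefs)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, (pre, add, delete) in actions.items():
            if pre <= state:
                nxt = frozenset((state - delete) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None  # no plan exists

print(plan(set(), {"tea_made"}))  # ['boil_water', 'add_tea']
```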

26
Q

Problems with planning

A
  • Frame problem: describe what does not change by an action
  • Qualification problem: describe all preconditions of an action
  • Ramification problem: describe all consequences of an action
  • Prediction problem: describe the duration that something remains true
27
Q

Blind/fanatical commitment

A

maintain an intention until you believe the intention has actually been achieved

28
Q

Single-minded commitment

A

maintain an intention until you believe that either the intention has been achieved, or else that it is no longer possible to achieve the intention

29
Q

Open-minded commitment

A

maintain an intention until you believe that either the intention has been achieved, or else that it is no longer one of your goals.
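
The three commitment strategies differ only in when an intention is dropped; as boolean drop-conditions (a sketch, with the predicate names made up):

```python
def drop_blind(achieved, impossible, still_desired):
    """Blind/fanatical: drop only once the intention is achieved."""
    return achieved

def drop_single_minded(achieved, impossible, still_desired):
    """Single-minded: drop when achieved or believed impossible."""
    return achieved or impossible

def drop_open_minded(achieved, impossible, still_desired):
    """Open-minded: drop when achieved or no longer a goal."""
    return achieved or not still_desired

# Not achieved, believed impossible, but still desired:
print(drop_blind(False, True, True))          # False - keeps trying
print(drop_single_minded(False, True, True))  # True
print(drop_open_minded(False, True, True))    # False
```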

30
Q

OWL reasoning examples

A

consistency checking (implicit/explicit contradictions) and concept satisfiability

31
Q

What can be expressed in OWL?

A

thesaurus, glossary, informal taxonomies, formal taxonomies, controlled vocabulary

32
Q

What do the following parts do in a practical reasoning program?
brf, options, filter, plan

A

brf: belief revision, updating the agent's beliefs with new percepts
options: generate desires from current beliefs and intentions
filter: deliberate, choosing which desires to commit to as intentions
plan: means-ends reasoning, producing a plan from beliefs and intentions
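
These four functions fit together in the practical reasoning control loop; a minimal sketch, where the stub bodies (thirst, tea-making) are placeholder assumptions:

```python
def brf(beliefs, percept):
    """Belief revision: fold a new percept into the belief set."""
    return beliefs | {percept}

def options(beliefs, intentions):
    """Generate desires from current beliefs and intentions (stub)."""
    return {"make_tea"} if "thirsty" in beliefs else set()

def filter_(beliefs, desires, intentions):
    """Deliberation: commit to desires as intentions (stub)."""
    return set(desires) or intentions

def plan(beliefs, intentions):
    """Means-ends reasoning: a canned plan per intention (stub)."""
    return ["boil_water", "add_tea"] if "make_tea" in intentions else []

# One pass of the agent loop:
beliefs, intentions = set(), set()
beliefs = brf(beliefs, "thirsty")
desires = options(beliefs, intentions)
intentions = filter_(beliefs, desires, intentions)
print(plan(beliefs, intentions))  # ['boil_water', 'add_tea']
```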