deck_15613364 Flashcards
FOL
First Order Logic
In terms of FOL, what are examples of the following?
Objects:
Relations:
Functions:
- Objects: people, houses, numbers, theories, Ronald McDonald, colors, baseball games, wars, centuries, ...
- Relations: red, round, bogus, prime, multistoried, ...; brother of, bigger than, inside, part of, has color, occurred after, owns, comes between, ...
- Functions: father of, best friend, third inning of, one more than, end of
Ontological Engineering
Creating general and flexible representations for complex domains.
Upper ontology:
The general framework of concepts
Categories and Objects
Stuff: a significant portion of reality that seems to defy any obvious individuation (division into distinct objects)
Intrinsic properties: belong to the very substance of the object, rather than to the object as a whole
Extrinsic properties: weight, length, shape, and the like, which are not retained when the object is subdivided
Substance: a category of objects that includes only intrinsic properties in its definition (a mass noun)
Count noun: a class that includes any extrinsic properties in its definition
Mental Objects
Mental objects are knowledge in someone’s head (or in a knowledge base)
Propositional attitudes are relations that an agent can have toward mental objects
* E.g.: Believes, Knows, Wants, and Informs
Lois knows that Superman can fly:
Knows(Lois, CanFly(Superman))
Modal Logic
FOL encodings of propositional attitudes can be verbose and clumsy. Regular logic is concerned with
a single modality, the modality of truth: whether a sentence is true or false.
Modal logic addresses this with special modal operators that take sentences
(rather than terms) as arguments
Semantic networks
- convenient for performing inheritance reasoning
- E.g.: Mary inherits the property of having two legs. To find out how many
legs Mary has, the inheritance algorithm follows the MemberOf link from Mary
to the category she belongs to, and then follows SubsetOf links up the hierarchy
until it finds a category for which there is a boxed Legs link (see the sketch below)
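A minimal sketch of this inheritance walk in Python; the network data and dictionary names are illustrative assumptions, not from the slides:

```python
# Toy semantic network: MemberOf and SubsetOf links plus "boxed" Legs values.
member_of = {"Mary": "FemalePersons"}
subset_of = {"FemalePersons": "Persons", "Persons": "Animals"}
legs = {"Persons": 2, "Animals": 4}   # boxed Legs links on some categories

def inherited_legs(obj):
    # follow the MemberOf link, then SubsetOf links up the hierarchy
    category = member_of[obj]
    while category is not None:
        if category in legs:          # found a boxed Legs link; stop here
            return legs[category]
        category = subset_of.get(category)
    return None

print(inherited_legs("Mary"))  # 2, inherited from Persons (the most specific hit)
```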
Description logics
- notations designed to make it easier to describe definitions and properties of categories
- evolved from semantic networks in response to pressure to formalize what the networks mean, while retaining the emphasis on taxonomic structure as an organizing principle
- Principal inference tasks:
  - Subsumption: checking whether one category is a subset of another by comparing their definitions
  - Classification: checking whether an object belongs to a category
- The CLASSIC language (Borgida et al., 1989) is a typical description logic
- E.g.: bachelors are unmarried adult males
  - Bachelor = And(Unmarried, Adult, Male)
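A toy subsumption check in Python, treating And-definitions as sets of conjuncts; this simplified encoding is an assumption for illustration, and CLASSIC itself is far richer:

```python
# Categories defined as conjunctions of atomic properties.
Bachelor = frozenset({"Unmarried", "Adult", "Male"})
Male = frozenset({"Male"})
UnmarriedAdult = frozenset({"Unmarried", "Adult"})

def subsumes(general, specific):
    # general subsumes specific if every conjunct of the general
    # category is also required by the specific one
    return general <= specific

print(subsumes(Male, Bachelor))            # True: every bachelor is male
print(subsumes(Bachelor, UnmarriedAdult))  # False: an unmarried adult need not be male
```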
Belief revision:
some inferred facts will turn out to be wrong and will have to be
retracted in light of new information
Truth maintenance systems
or TMSs, are designed to handle these complications: when a sentence is retracted,
any additional sentences that were inferred from it must be retracted as well
Justification-based truth maintenance system (JTMS)
- Each sentence in the knowledge base is annotated with a justification consisting of the set of sentences from which it was inferred
- Justifications make retraction efficient
- Assumes that sentences that are considered once will probably be considered again
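A minimal JTMS-flavoured sketch in Python; the sentence strings and helper names are illustrative assumptions. Retracting a premise also retracts everything that was inferred from it:

```python
justifications = {}   # sentence -> set of sentences it was inferred from

def add(sentence, support=frozenset()):
    justifications[sentence] = frozenset(support)

def retract(sentence):
    justifications.pop(sentence, None)
    # also retract every sentence whose justification used the removed one
    for s, support in list(justifications.items()):
        if sentence in support:
            retract(s)

add("P")
add("P => Q")
add("Q", {"P", "P => Q"})
retract("P")           # Q loses its justification and is retracted too
print(justifications)  # {'P => Q': frozenset()}
```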
FOL basic elements
Events
NLP
Natural Language Processing
N-GRAMS
Smoothing n-gram models
Atomic Model in N-grams
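The cards above name n-gram models and smoothing without bodies; as a hedged sketch, here is a bigram model with add-one (Laplace) smoothing, where the corpus and vocabulary are toy assumptions:

```python
from collections import Counter

corpus = "the dog saw the cat the dog ran".split()
vocab = set(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def p_laplace(w2, w1):
    # add-one smoothing: unseen bigrams get a small nonzero probability
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + len(vocab))

print(p_laplace("dog", "the"))  # seen bigram: relatively high (0.375)
print(p_laplace("ran", "cat"))  # unseen bigram: small but nonzero (~0.167)
```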
Part-of-speech (POS) tagging
* a way to categorize words by lexical category (tag)
* POS tags allow language models to capture generalizations
such as “adjectives generally come before nouns in English”
* a useful first step in many other NLP tasks, such as
question answering or translation
The big Python libraries for this are NLTK and spaCy. Libraries exist
in other languages, but Python is generally regarded as the main NLP
language these days.
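A minimal POS-tagging example with NLTK; it assumes the required models have been fetched, e.g. via nltk.download("punkt") and nltk.download("averaged_perceptron_tagger"):

```python
import nltk

tokens = nltk.word_tokenize("The quick brown fox jumps over the lazy dog")
print(nltk.pos_tag(tokens))
# e.g. [('The', 'DT'), ('quick', 'JJ'), ('brown', 'JJ'), ('fox', 'NN'), ...]
```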
Grammar
A grammar is a set of rules that defines the tree structure of allowable phrases
* A language is the set of sentences that follow those rules.
* Syntactic categories such as noun phrase or verb phrase help to constrain the
probable words at each point within a sentence
* The phrase structure provides a framework for the meaning or semantics of the
sentence
Probabilistic context-free grammar (PCFG)
* A probabilistic grammar assigns a probability to each string
* “context-free” means that any rule can be used in any context
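A small PCFG sketch with NLTK; the toy grammar and rule probabilities are illustrative assumptions. ViterbiParser returns the most probable parse:

```python
import nltk

grammar = nltk.PCFG.fromstring("""
S -> NP VP [1.0]
NP -> Det N [0.6] | Name [0.4]
VP -> V NP [1.0]
Det -> 'the' [1.0]
N -> 'dog' [0.5] | 'cat' [0.5]
Name -> 'Mary' [1.0]
V -> 'saw' [1.0]
""")

parser = nltk.ViterbiParser(grammar)
for tree in parser.parse("Mary saw the dog".split()):
    print(tree, tree.prob())   # the parse tree and its probability
```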
Parsing
Parsing is the process of analyzing a string of words to uncover its phrase
structure, according to the rules of a grammar.
* Search for a valid parse tree whose leaves are the words of the string
* We can start with the S symbol and search top down, or start with the
words and search bottom up.
* Inefficiency: if the algorithm guesses wrong, it has to backtrack all
the way to the first word and reanalyze the whole sentence under the
other interpretation.
* Remedy (chart parsing / dynamic programming): every time we analyze a
substring, store the result so we won’t have to reanalyze it later (see
the sketch below).
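A compact sketch of this store-substring-results idea: a CYK-style recognizer that fills a table of categories per span, over a toy grammar in Chomsky normal form (the grammar and names are assumptions for illustration):

```python
from collections import defaultdict

# CNF rules: (A, (B, C)) is binary, (A, (word,)) is lexical
rules = [
    ("S", ("NP", "VP")), ("NP", ("Det", "N")), ("VP", ("V", "NP")),
    ("Det", ("the",)), ("N", ("dog",)), ("N", ("cat",)),
    ("V", ("saw",)), ("NP", ("Mary",)),
]

def cyk(words):
    n = len(words)
    table = defaultdict(set)   # table[(i, j)] = categories spanning words[i:j]
    for i, w in enumerate(words):
        for lhs, rhs in rules:
            if rhs == (w,):
                table[(i, i + 1)].add(lhs)
    for length in range(2, n + 1):              # span length
        for i in range(n - length + 1):         # span start
            for k in range(i + 1, i + length):  # split point
                for lhs, rhs in rules:
                    if len(rhs) == 2 and rhs[0] in table[(i, k)] \
                            and rhs[1] in table[(k, i + length)]:
                        table[(i, i + length)].add(lhs)
    return "S" in table[(0, n)]

print(cyk("Mary saw the dog".split()))  # True
```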
Parsing Tree
Dependency grammar:
- assumes that syntactic structure is formed by binary relations between lexical items, without a need for syntactic constituents
- if each phrase in a phrase structure tree is annotated with its head, the dependency tree can be recovered
- a dependency tree can be converted to a phrase structure tree by introducing arbitrary categories
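A dependency-parse example with spaCy; it assumes the small English model is installed (python -m spacy download en_core_web_sm):

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Mary saw the dog")
for token in doc:
    # each word is linked to its head by a binary relation (the dep label)
    print(token.text, token.dep_, "->", token.head.text)
```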
Learn Parser From Examples
Learning semantic grammars
Pragmatics:
- resolving the meaning of indexicals, which are phrases that refer directly to the current situation
- Example: in the sentence “I am in Boston today,” both “I” and “today” are indexicals. The word “I” would be represented by Speaker, a fluent that refers to different objects at different times
- interpreting the speaker’s intent
- The speaker’s utterance is considered a speech act, and it is up to the hearer to decipher what type of action it is (a question, a statement, a promise, a warning, a command, etc.)
Time and tense:
Ambiguity:
Disambiguation
Deep Learning
Feedforward network
- connections only in one direction (input to output)
- directed acyclic graph with designated input and output nodes
- No loops
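A minimal feedforward pass in NumPy, as a sketch; the layer sizes and random weights are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input (3) -> hidden (4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # hidden (4) -> output (2)

def relu(z):
    return np.maximum(0, z)

def forward(x):
    h = relu(W1 @ x + b1)   # hidden activations
    return W2 @ h + b2      # output: strictly input-to-output, no loops

print(forward(np.array([1.0, 0.5, -0.2])))
```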
Recurrent network
- feeds its intermediate or final outputs back into its own inputs
- signal values within the network form a dynamical system that has internal
state or memory
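For contrast, a minimal recurrent step in NumPy (again a sketch with arbitrary sizes): feeding the hidden state back in is what gives the network internal state, or memory:

```python
import numpy as np

rng = np.random.default_rng(0)
Wx = rng.normal(size=(4, 3))   # input -> hidden
Wh = rng.normal(size=(4, 4))   # hidden -> hidden (the feedback loop)
b = np.zeros(4)

h = np.zeros(4)                        # internal state
for x in rng.normal(size=(5, 3)):      # a sequence of 5 inputs
    h = np.tanh(Wx @ x + Wh @ h + b)   # state depends on all inputs so far
print(h)
```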
Networks as complex functions
Different activation functions
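The last two cards are bare titles; as a hedged sketch of the usual activation-function choices (the slides may list others):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))   # squashes to (0, 1)

def tanh(z):
    return np.tanh(z)             # squashes to (-1, 1)

def relu(z):
    return np.maximum(0, z)       # rectified linear unit

z = np.linspace(-2, 2, 5)
for f in (sigmoid, tanh, relu):
    print(f.__name__, f(z))
```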