Building a Conversational Language Understanding model Flashcards
Which three factors are captured by a CLU model to support a conversation?
Utterance
Intention
Entities
What would be a reasonable mapping of utterance, intention, and entities in programming terms?
Utterance can be mapped to user or system input
Intention can be mapped to a subroutine, e.g. GetDevice, TurnLightOn
Entities are extracted from the utterance as potential objects, e.g. DateTime, People, Locations…
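The mapping above can be sketched in code. This is an illustrative example only (the function and dictionary names are hypothetical, not part of the Azure SDK): the intent selects a subroutine, and the extracted entities become that subroutine's arguments.

```python
def turn_light_on(location: str) -> str:
    """Subroutine mapped to the TurnLightOn intent."""
    return f"Light turned on in {location}"

def get_device(device: str) -> str:
    """Subroutine mapped to the GetDevice intent."""
    return f"Status of {device}: online"

# Intent name -> subroutine, mirroring the mapping described above.
INTENT_HANDLERS = {
    "TurnLightOn": turn_light_on,
    "GetDevice": get_device,
}

def handle(intent: str, entities: dict) -> str:
    """Dispatch a recognised intent with its extracted entities."""
    return INTENT_HANDLERS[intent](**entities)

# The utterance "Turn on the kitchen light" might yield:
print(handle("TurnLightOn", {"location": "kitchen"}))
```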
Two categories of Azure AI Language features, with examples of each
Built-in features: sentiment analysis, language detection, key phrase extraction, text summarisation, PII/PHI detection, named entity recognition, Text Analytics for health
Learned features: CLU, question answering, custom named entity recognition, custom text classification, orchestration workflow
How are patterns used to improve CLU?
Patterns are used to disambiguate utterances and determine the correct user intention when similar terms are used to mean different things.
Generally, multiple versions of an utterance with different phrasings are used to train CLU to recognise the same intention. Patterns are used for the exact opposite: similar utterances are distinguished to determine different intentions, e.g. "Is light on?" vs "Turn light on", where the different structure and punctuation around "light on" lead to different intents.
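The disambiguation idea can be illustrated with a toy sketch (hypothetical regex patterns, not the service's actual matching logic): both utterances contain "light on", but the surrounding structure and punctuation select different intents.

```python
import re

# Each pattern captures the *structure* around "light on", not the phrase itself.
PATTERNS = [
    (re.compile(r"^is .*light on\??$", re.IGNORECASE), "CheckLightStatus"),
    (re.compile(r"^turn .*light on$", re.IGNORECASE), "TurnLightOn"),
]

def match_intent(utterance: str) -> str:
    """Return the first intent whose pattern matches the utterance."""
    for pattern, intent in PATTERNS:
        if pattern.match(utterance.strip()):
            return intent
    return "None"

print(match_intent("Is the light on?"))   # question structure -> status check
print(match_intent("Turn the light on"))  # imperative structure -> command
```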