Intelligent UIs Flashcards
what is an Intelligent UI?
one that uses machine learning and other techniques from AI to assist the user
- recommender systems
- Amazon Echo (virtual assistant)
what is a message? how do we quantify it?
A sequence of symbols (such as 0s and 1s)
the self information I of a message m in bits is:
I(m) = log2(1/P(m))
where P(m) is the probability a message will appear
how much information do probable, improbable and guaranteed messages carry?
probable: less than improbable
guaranteed: 0
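The formula above can be checked directly; a minimal sketch in Python (function name is my own):

```python
import math

def self_information(p):
    """Self-information I(m) = log2(1/P(m)) in bits."""
    return math.log2(1 / p)

# a fair coin flip carries 1 bit; a guaranteed message carries 0
print(self_information(0.5))   # 1.0
print(self_information(0.25))  # 2.0 - less probable, more information
print(self_information(1.0))   # 0.0
```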
what is entropy?
H is a measure of the uncertainty in the message space M
H(M) = Σ P(m)·I(m), summed over all m in M
if the entropy is 0 everything is completely predictable
entropy tells us the average number of bits we need to encode a message
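Combining the two cards: entropy is just self-information weighted by probability. A small sketch (names are my own):

```python
import math

def entropy(probs):
    """H(M) = sum over m of P(m) * log2(1/P(m)), in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# uniform over 4 messages -> 2 bits needed on average
print(entropy([0.25] * 4))  # 2.0
# one certain message -> completely predictable, 0 bits
print(entropy([1.0]))       # 0.0
```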
what is bandwidth?
the number of bits per second we can communicate between humans and machines
shorter messages increase bandwidth. we can shorten messages by exploiting redundancies, e.g. encoding AAAA as 4A
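The AAAA → 4A shortening above is run-length encoding; a minimal sketch:

```python
from itertools import groupby

def run_length_encode(s):
    """Replace each run of a repeated symbol with count + symbol."""
    return ''.join(f"{len(list(group))}{ch}" for ch, group in groupby(s))

print(run_length_encode("AAAA"))     # 4A
print(run_length_encode("AAAABBB"))  # 4A3B
```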
what is a redundancy?
the difference between the average number of bits actually used and the optimal number of bits necessary to uniquely encode all messages is a measure of redundancy
what is modelling?
modelling redundancies means determining a way to encode messages in some fashion which is as close as possible to an optimal coding
e.g. language models: probability of the next word or phrase
touchscreen gestures: probability of the user’s intended trajectories
what are the zeroth- to third-order approximations in language modelling?
0: letters sampled uniformly at random
1: sampling based on the probabilities of letters occurring in english texts
2: every letter depends on its previous letter according to probabilities in english
3: as 2, but depends on the previous 2 letters
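A sketch of the first- and second-order approximations, estimating letter probabilities from a toy corpus (the corpus string and variable names are my own):

```python
import random
from collections import Counter, defaultdict

text = "the theory of information models the probability of the next letter"

# first-order: sample each letter independently by its frequency in the text
letters, weights = zip(*Counter(text).items())
first_order = ''.join(random.choices(letters, weights=weights, k=20))

# second-order: each letter depends on the previous letter
successors = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    successors[a][b] += 1

out = [random.choice(text)]
for _ in range(19):
    nxt = successors[out[-1]]
    if nxt:
        chars, counts = zip(*nxt.items())
        out.append(random.choices(chars, weights=counts)[0])
    else:  # previous letter never seen mid-text; restart the chain
        out.append(random.choice(text))
second_order = ''.join(out)

print(first_order)
print(second_order)
```

The second-order sample tends to look more English-like because common pairs such as "th" and "he" dominate.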
what is Zipf’s law?
the 100 most common british words make up 46% of the entire corpus
Zipf's law estimates the probability Pr of occurrence of a word in a corpus to be proportional to 1/r^alpha, where r is the rank of the word by decreasing frequency and alpha is close to one
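A sketch of the Zipf distribution, normalised over an assumed vocabulary size (the defaults are my own):

```python
def zipf_prob(rank, alpha=1.0, vocab_size=1000):
    """P(rank) proportional to 1/rank^alpha, normalised over vocab_size ranks."""
    norm = sum(1 / r ** alpha for r in range(1, vocab_size + 1))
    return (1 / rank ** alpha) / norm

# with alpha = 1, the most frequent word is about twice as likely
# as the second, three times as likely as the third, and so on
print(zipf_prob(1) / zipf_prob(2))
print(zipf_prob(1) / zipf_prob(3))
```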
what is a unigram model
one that assigns each word a probability independent of any preceding words - ignores context, so not a good predictor
what is the markov assumption?
the probability of a word only depends on a finite set of previous words
what is a bigram model? what is an n gram model?
approximates the probability of a word given the single previous word
n gram model has n-1 words of context
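A bigram model estimated by relative counts, as a minimal sketch (toy sentence and names are my own):

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Estimate P(word | previous word) from relative bigram counts."""
    counts = defaultdict(Counter)
    for prev, cur in zip(tokens, tokens[1:]):
        counts[prev][cur] += 1
    return {prev: {w: c / sum(nxt.values()) for w, c in nxt.items()}
            for prev, nxt in counts.items()}

tokens = "the cat sat on the mat".split()
model = train_bigram(tokens)
print(model["the"])  # {'cat': 0.5, 'mat': 0.5}
```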
what is perplexity?
the weighted average number of choices a random variable has to make; it equals 2^entropy
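Since perplexity is just 2^entropy, it follows directly from the entropy card; a sketch:

```python
import math

def perplexity(probs):
    """2^H for a distribution: the average effective number of choices."""
    h = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return 2 ** h

# uniform over 4 messages: entropy 2 bits -> 4 equally likely choices
print(perplexity([0.25] * 4))  # 4.0
print(perplexity([1.0]))       # 1.0 - no choice at all
```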
what are some design issues for an intelligent UI?
error correction - users can correct recognition errors quickly
supporting error detection - help users detect errors made by the system
transparency - the user knows why their input was interpreted the way it was
agency - users feel in control
TASE