Intelligent UIs Flashcards

1
Q

what is an Intelligent UI?

A

one that uses machine learning and other techniques from AI to assist the user

  • recommender systems
  • Amazon Echo (virtual assistant)
2
Q

what is a message? how do we quantify it?

A

A sequence of symbols (such as 0s and 1s)
the self-information I of a message m, in bits, is:

I(m) = log2(1 / P(m))

where P(m) is the probability that the message appears
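
A minimal sketch of the formula in Python (the function name is my own, not from the deck):

```python
import math

def self_information(p: float) -> float:
    """I(m) = log2(1 / P(m)), the self-information of a message in bits."""
    return math.log2(1.0 / p)

print(self_information(0.5))    # 1.0 bit: a fair coin flip
print(self_information(0.125))  # 3.0 bits: a 1-in-8 message is more surprising
print(self_information(1.0))    # 0.0 bits: a guaranteed message tells us nothing
```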

3
Q

how much information do probable, improbable and guaranteed messages carry?

A

probable: less than improbable (e.g. a message with P(m) = 1/2 carries 1 bit, one with P(m) = 1/8 carries 3 bits)
guaranteed: 0, since I(m) = log2(1/1) = 0

4
Q

what is entropy?

A

H is a measure of the uncertainty in the message space M

H(M) = sum over m in M of P(m) * I(m)

if the entropy is 0 everything is completely predictable

entropy tells us the average number of bits we need to encode a message
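
A short sketch, assuming the message space is given as a dict mapping each message to its probability (my representation, not the deck's):

```python
import math

def entropy(dist: dict[str, float]) -> float:
    """H(M) = sum of P(m) * I(m), equivalently -sum of P(m) * log2(P(m))."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

print(entropy({"a": 1.0}))                 # 0.0: completely predictable
print(entropy({"a": 0.5, "b": 0.5}))       # 1.0: one bit per message
print(entropy({c: 0.25 for c in "abcd"}))  # 2.0: four equally likely messages
```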

5
Q

what is bandwidth?

A

the number of bits per second we can communicate between humans and machines

shorter messages increase bandwidth. we can shorten messages by exploiting redundancy, e.g. encoding AAAA as 4A (run-length encoding)
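
A minimal run-length encoding sketch along these lines (my own illustration):

```python
from itertools import groupby

def run_length_encode(message: str) -> str:
    """Shorten runs of a repeated symbol to <count><symbol>."""
    return "".join(f"{len(list(run))}{symbol}" for symbol, run in groupby(message))

print(run_length_encode("AAAA"))    # 4A
print(run_length_encode("AABBBC"))  # 2A3B1C
```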

6
Q

what is redundancy?

A

the difference between the average number of bits actually used to encode messages and the optimal number of bits necessary to uniquely encode them is a measure of redundancy
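
As a sketch, taking the entropy from the previous card as the optimal average and a fixed-length code as what is actually used (the example code lengths are my assumption):

```python
import math

def redundancy(dist: dict[str, float], code_lengths: dict[str, int]) -> float:
    """Average bits actually used minus the optimal average (the entropy)."""
    avg_used = sum(p * code_lengths[m] for m, p in dist.items())
    optimal = -sum(p * math.log2(p) for p in dist.values() if p > 0)
    return avg_used - optimal

# A fixed 2-bit code for a skewed 3-message space wastes half a bit per message:
dist = {"a": 0.5, "b": 0.25, "c": 0.25}
print(redundancy(dist, {"a": 2, "b": 2, "c": 2}))  # 2.0 - 1.5 = 0.5
```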

7
Q

what is modelling?

A

modelling redundancies means finding a way to encode messages that is as close as possible to an optimal coding

e.g. language models: probability of the next word or phrase
touchscreen gestures: probability of the user’s intended trajectories

8
Q

what are the zeroth- to third-order approximations in language modelling?

A

0: letters drawn at random
1: sampling based on the probabilities of letters occurring in English texts
2: every letter depends on its previous letter, according to probabilities in English
3: as 2, but dependent on the previous 2 letters
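
A sketch of the first two orders, assuming a small stand-in corpus for "English texts" (higher orders condition on preceding letters in the same way):

```python
import random
from collections import Counter

corpus = "the quick brown fox jumps over the lazy dog " * 100  # stand-in corpus

def zeroth_order(n: int) -> str:
    """Order 0: every symbol equally likely."""
    alphabet = sorted(set(corpus))
    return "".join(random.choice(alphabet) for _ in range(n))

def first_order(n: int) -> str:
    """Order 1: sample letters according to their corpus frequencies."""
    counts = Counter(corpus)
    letters, weights = zip(*counts.items())
    return "".join(random.choices(letters, weights=weights, k=n))

print(zeroth_order(40))
print(first_order(40))
```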

9
Q

what is Zipf’s law?

A

the 100 most common British words make up 46% of the entire corpus

Zipf's law estimates the probability Pr of occurrence of a word in a corpus to be proportional to 1/r^alpha, where r is the statistical rank of the word in decreasing order of frequency and alpha is close to one
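
A quick sketch of the estimate, normalised over a finite vocabulary (alpha = 1 and the vocabulary size are my assumptions):

```python
def zipf_probability(rank: int, vocab_size: int, alpha: float = 1.0) -> float:
    """Pr proportional to 1 / rank^alpha, normalised over the vocabulary."""
    normaliser = sum(1.0 / r**alpha for r in range(1, vocab_size + 1))
    return (1.0 / rank**alpha) / normaliser

# The 2nd most common word is about half as likely as the 1st:
print(zipf_probability(1, 10_000))  # ~0.102
print(zipf_probability(2, 10_000))  # ~0.051
```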

10
Q

what is a unigram model?

A

one that assumes any word can follow any other word with equal probability - not good, since real word frequencies are highly unequal

11
Q

what is the markov assumption?

A

the probability of a word depends only on a finite number of immediately preceding words

12
Q

what is a bigram model? what is an n-gram model?

A

approximates the probability of a word given only the single previous word

an n-gram model conditions on the previous n-1 words of context
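
A toy bigram model built from raw counts (the function names and corpus are mine):

```python
from collections import Counter, defaultdict

def train_bigram(tokens: list[str]) -> dict[str, Counter]:
    """For each word, count which words follow it."""
    follows: dict[str, Counter] = defaultdict(Counter)
    for prev, word in zip(tokens, tokens[1:]):
        follows[prev][word] += 1
    return follows

def bigram_probability(model: dict[str, Counter], prev: str, word: str) -> float:
    """P(word | prev) = count(prev word) / count(prev *)."""
    total = sum(model[prev].values())
    return model[prev][word] / total if total else 0.0

tokens = "the cat sat on the mat the cat ran".split()
model = train_bigram(tokens)
print(bigram_probability(model, "the", "cat"))  # 2/3: "the" is followed by "cat" twice, "mat" once
print(bigram_probability(model, "cat", "sat"))  # 1/2
```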

13
Q

what is perplexity?

A

the weighted average number of choices a random variable has to make: perplexity = 2^H, where H is the entropy
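
Continuing the entropy sketch above (same assumed dict-of-probabilities representation):

```python
import math

def perplexity(dist: dict[str, float]) -> float:
    """2 ** H(M): the effective number of equally likely choices."""
    h = -sum(p * math.log2(p) for p in dist.values() if p > 0)
    return 2 ** h

print(perplexity({"a": 0.5, "b": 0.5}))       # 2.0, like a fair coin
print(perplexity({c: 0.25 for c in "abcd"}))  # 4.0
print(perplexity({"a": 0.9, "b": 0.1}))       # ~1.38, almost no real choice
```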

14
Q

what are some design issues for an intelligent UI?

A

error correction - users can correct recognition errors quickly
supporting error detection - help users detect errors made by the system
transparency - the user knows why their input was interpreted the way it was
agency - users feel in control

TASE: Transparency, Agency, Supporting error detection, Error correction
