UNCERTAINTY Flashcards

1
Q

can be represented as a number of events and the likelihood, or probability, of each of them happening.

A

Uncertainty

2
Q

Axioms in Probability

A

0 ≤ P(ω) ≤ 1: every value representing a probability must range between 0 and 1, inclusive.

The probabilities of every possible event, when summed together, are equal to 1.

3
Q

the degree of belief in a proposition in the absence of any other evidence.

A

Unconditional probability

4
Q

the degree of belief in a proposition given some evidence that has already been revealed

A

Conditional probability

5
Q

a variable in probability theory with a domain of possible values that it can take on

A

random variable

6
Q

is the knowledge that the occurrence of one event does not affect the probability of the other event.

A

independence

7
Q

P(a ∧ b) = P(a)P(b)

A

independence

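
The product rule can be checked by brute-force enumeration. A minimal sketch (not from the deck), using the hypothetical events "first die shows 1" (a) and "second die shows 1" (b) for two fair dice:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

p_a = Fraction(sum(1 for a, b in outcomes if a == 1), len(outcomes))   # P(a)
p_b = Fraction(sum(1 for a, b in outcomes if b == 1), len(outcomes))   # P(b)
p_ab = Fraction(sum(1 for a, b in outcomes if a == 1 and b == 1),
                len(outcomes))                                         # P(a ∧ b)

# Independence: P(a ∧ b) = P(a)P(b), i.e. 1/36 == 1/6 * 1/6
assert p_ab == p_a * p_b
```
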
8
Q

P(a | b)

A

conditional probability

9
Q

commonly used in probability theory to compute conditional probability

A

Bayes’ rule

10
Q

In words, Bayes’ rule says that the probability of b given a is equal to the probability of a given b, times the probability of b, divided by the probability of a.

A

Bayes’ rule

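
A minimal sketch of applying the rule in Python (the clouds/rain numbers are made up for illustration):

```python
def bayes(p_a_given_b, p_b, p_a):
    """Bayes' rule: P(b | a) = P(a | b) * P(b) / P(a)."""
    return p_a_given_b * p_b / p_a

# Hypothetical numbers: P(clouds | rain) = 0.8, P(rain) = 0.1, P(clouds) = 0.4
p_rain_given_clouds = bayes(0.8, 0.1, 0.4)
assert abs(p_rain_given_clouds - 0.2) < 1e-9  # 0.8 * 0.1 / 0.4 = 0.2
```
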
11
Q

the likelihood of multiple events all occurring

A

Joint probability

12
Q

Used to deduce conditional probability

A

joint probability

13
Q

P(¬a) = 1 - P(a)

A

negation

14
Q

P(a ∨ b) = P(a) + P(b) - P(a ∧ b)

A

Inclusion-Exclusion

15
Q

P(a) = P(a, b) + P(a, ¬b)

A

Marginalization

16
Q

P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b)

A

Conditioning
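
The rules on the surrounding cards (negation, marginalization, conditioning) can be sanity-checked against a small joint distribution over two Boolean variables. The numbers below are hypothetical:

```python
# Hypothetical joint distribution over Boolean variables a and b.
joint = {
    (True, True): 0.08,
    (True, False): 0.32,
    (False, True): 0.02,
    (False, False): 0.58,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9  # axiom: probabilities sum to 1

# Marginalization: P(a) = P(a, b) + P(a, ¬b)
p_a = joint[(True, True)] + joint[(True, False)]

# Conditioning: P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b)
p_b = joint[(True, True)] + joint[(False, True)]
p_a_given_b = joint[(True, True)] / p_b
p_a_given_not_b = joint[(True, False)] / (1 - p_b)  # negation: P(¬b) = 1 - P(b)
p_a_alt = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

assert abs(p_a - 0.4) < 1e-9
assert abs(p_a_alt - p_a) < 1e-9  # both routes agree
```
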

17
Q

data structure that represents the dependencies among random variables.

A

Bayesian network

18
Q

Bayesian networks have the following properties:

A
  • They are directed graphs.
  • Each node on the graph represents a random variable.
  • An arrow from X to Y represents that X is a parent of Y. That is, the probability distribution of Y depends on the value of X.
  • Each node X has probability distribution P(X | Parents(X)).
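
A two-node network with these properties can be sketched with plain dictionaries (the Rain -> Umbrella variables and all numbers are hypothetical):

```python
# Minimal two-node network: Rain -> Umbrella (hypothetical numbers).
# Each node stores P(X | Parents(X)); Rain has no parents.
p_rain = {True: 0.2, False: 0.8}
p_umbrella_given_rain = {
    True: {True: 0.9, False: 0.1},   # P(Umbrella | Rain = true)
    False: {True: 0.2, False: 0.8},  # P(Umbrella | Rain = false)
}

# The joint probability factorizes along the arrows:
# P(Rain, Umbrella) = P(Rain) * P(Umbrella | Rain)
p_rain_and_umbrella = p_rain[True] * p_umbrella_given_rain[True][True]
assert abs(p_rain_and_umbrella - 0.18) < 1e-9
```
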
19
Q

Inference has multiple properties:

A
  • Query X
  • Evidence variables E
  • Hidden variables Y
  • The goal: calculate P(X | e)
20
Q

a process of finding the probability distribution of variable X given observed evidence e and some hidden variables Y.

A

inference by enumeration
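
Inference by enumeration can be sketched on a small hypothetical network echoing the rain/train/appointment scenario used elsewhere in this deck (all numbers made up):

```python
# Hypothetical chain-structured network: Rain -> Train -> Appointment.
p_rain = {True: 0.3, False: 0.7}
p_train_on_time = {True: {True: 0.6, False: 0.4},    # given Rain = true
                   False: {True: 0.9, False: 0.1}}   # given Rain = false
p_attend = {True: {True: 0.9, False: 0.1},           # given Train on time
            False: {True: 0.6, False: 0.4}}          # given Train delayed

def joint(rain, train, attend):
    return p_rain[rain] * p_train_on_time[rain][train] * p_attend[train][attend]

# Query X = Rain, evidence e = (Appointment attended), hidden Y = Train.
# Sum the joint over the hidden variable, then normalize.
unnormalized = {r: sum(joint(r, t, True) for t in (True, False))
                for r in (True, False)}
total = sum(unnormalized.values())
p_rain_given_attend = {r: v / total for r, v in unnormalized.items()}
assert abs(sum(p_rain_given_attend.values()) - 1.0) < 1e-9
```
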

21
Q

one technique of approximate inference

A

sampling

22
Q

each variable is sampled for a value according to its probability distribution

A

sampling

23
Q

likelihood weighting steps:

A
  • Start by fixing the values for evidence variables.
  • Sample the non-evidence variables using conditional probabilities in the Bayesian network.
  • Weight each sample by its likelihood: the probability of all the evidence occurring.
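
The steps above can be sketched in Python on a hypothetical two-variable network (Rain -> Umbrella, made-up numbers), with Umbrella = true as the fixed evidence:

```python
import random

random.seed(0)
# Hypothetical network: Rain -> Umbrella; evidence: Umbrella = true.
p_rain = {True: 0.2, False: 0.8}
p_umbrella = {True: 0.9, False: 0.2}  # P(Umbrella = true | Rain)

weighted = {True: 0.0, False: 0.0}
for _ in range(100_000):
    rain = random.random() < p_rain[True]  # sample the non-evidence variable
    weight = p_umbrella[rain]              # weight = P(evidence | sample)
    weighted[rain] += weight

total = weighted[True] + weighted[False]
estimate = weighted[True] / total  # ≈ P(Rain | Umbrella) = 0.18 / 0.34 ≈ 0.53
```
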
24
Q

the probability of all the evidence occurring.

A

likelihood

25
Q

an assumption that the current state depends on only a finite fixed number of previous states.

A

Markov assumption

26
Q

a sequence of random variables where the distribution of each variable follows the Markov assumption.

A

Markov chain

27
Q

To start constructing a Markov chain, we need a _____ ______ that will specify the probability distributions of the next event based on the possible values of the current event.

A

transition model

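
Given a transition model, sampling a Markov chain takes only a few lines of Python. The weather states and probabilities below are hypothetical:

```python
import random

random.seed(0)
# Hypothetical transition model: P(next weather | current weather).
transition = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def step(state):
    # Sample the next state from the current state's distribution.
    r, cumulative = random.random(), 0.0
    for nxt, p in transition[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

chain = ["sun"]
for _ in range(10):
    chain.append(step(chain[-1]))  # each state depends only on the previous one
```
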
28
Q

a type of Markov model for a system with hidden states that generate some observed event.

A

hidden Markov model

29
Q

assumption that the evidence variable depends only on the corresponding state

A

sensor Markov assumption

30
Q

Based on hidden Markov models, multiple tasks can be achieved:

A
  • Filtering: given observations from start until now, calculate the probability distribution for the current state. For example, given information on when people bring umbrellas from the start of time until today, we generate a probability distribution for whether it is raining today or not.
  • Prediction: given observations from start until now, calculate the probability distribution for a future state.
  • Smoothing: given observations from start until now, calculate the probability distribution for a past state. For example, calculating the probability of rain yesterday given that people brought umbrellas today.
  • Most likely explanation: given observations from start until now, calculate the most likely sequence of events.
31
Q

Consider a standard 52-card deck with 13 card values (Ace, King, Queen, Jack, and 2-10) in each of the four suits (clubs, diamonds, hearts, spades). If a card is drawn at random, what is the probability that it is a spade or a two?

A

About 0.308 (16/52)
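
The answer follows from inclusion-exclusion; a quick check with exact fractions:

```python
from fractions import Fraction

p_spade = Fraction(13, 52)
p_two = Fraction(4, 52)
p_spade_and_two = Fraction(1, 52)  # the two of spades

# Inclusion-exclusion: P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
p_spade_or_two = p_spade + p_two - p_spade_and_two
assert p_spade_or_two == Fraction(16, 52)  # ≈ 0.308
```
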
32
Q

Imagine flipping two fair coins, where each coin has a Heads side and a Tails side, with Heads coming up 50% of the time and Tails coming up 50% of the time. What is the probability that after flipping those two coins, one of them lands heads and the other lands tails?

A

0.5 = 1/2
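
A quick check by enumerating the four equally likely outcomes:

```python
from itertools import product

# All four equally likely outcomes of two fair coin flips: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
favorable = [o for o in outcomes if set(o) == {"H", "T"}]  # one heads, one tails
probability = len(favorable) / len(outcomes)
assert probability == 0.5  # HT and TH out of four outcomes
```
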
33
Q

Which of the following sentences is true?
  • Assuming we know the train is on time, whether or not there is rain affects the probability that the appointment is attended.
  • Assuming we know there is rain, whether or not there is track maintenance does not affect the probability that the train is on time.
  • Assuming we know there is track maintenance, whether or not there is rain does not affect the probability that the train is on time.
  • Assuming we know the train is on time, whether or not there is track maintenance does not affect the probability that the appointment is attended.
  • Assuming we know there is track maintenance, whether or not there is rain does not affect the probability that the appointment is attended.

A

Assuming we know the train is on time, whether or not there is track maintenance does not affect the probability that the appointment is attended.
34
Q

Two factories — Factory A and Factory B — design batteries to be used in mobile phones. Factory A produces 60% of all batteries, and Factory B produces the other 40%. 2% of Factory A’s batteries have defects, and 4% of Factory B’s batteries have defects. What is the probability that a battery is both made by Factory A and defective?

A

0.012
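
The answer is P(A) times P(defective | A); a one-line check:

```python
# Joint probability: P(A ∧ defective) = P(A) * P(defective | A)
p_a = 0.60
p_defect_given_a = 0.02
p_a_and_defective = p_a * p_defect_given_a
assert abs(p_a_and_defective - 0.012) < 1e-12
```
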