Chapter 2 Flashcards

1
Q

What are the two approaches to probability?

A
  • Frequentist interpretation
  • Bayesian interpretation
2
Q

Describe the Frequentist interpretation.

A
  • Probabilities represent long-run frequencies of events.
  • For example: if we flip a coin many times, we expect it to land heads about half the time.
3
Q

Describe the Bayesian interpretation.

A
  • Probability quantifies our uncertainty about something; it is fundamentally related to information rather than repeated trials.
  • For example: before flipping a coin, we believe it is equally likely to land heads or tails on the next toss.
4
Q

______ can be used to model uncertainty about events that do not have long-term frequencies. Give an example.

A
  • Bayesian interpretation
  • For example, we might have received a specific email message, and want to compute the probability it is spam.
5
Q

How can we calculate the union probability?

A

p(A ∨ B) = p(A) + p(B) − p(A ∧ B)
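As a quick sanity check, the formula can be verified numerically; the sketch below assumes a fair six-sided die with two illustrative events (not from the text):

```python
# Check inclusion-exclusion, p(A ∨ B) = p(A) + p(B) - p(A ∧ B),
# on a hypothetical fair six-sided die.
from fractions import Fraction

outcomes = set(range(1, 7))                 # sample space of one die roll
A = {o for o in outcomes if o % 2 == 0}     # event "even": {2, 4, 6}
B = {o for o in outcomes if o > 3}          # event "greater than 3": {4, 5, 6}

def p(event):
    """Probability of an event under a uniform distribution on outcomes."""
    return Fraction(len(event), len(outcomes))

direct = p(A | B)                           # p(A ∨ B) counted directly
formula = p(A) + p(B) - p(A & B)            # inclusion-exclusion
assert direct == formula == Fraction(2, 3)
```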

6
Q

How can we calculate the joint probability?

A
  • p(A ∧ B) = p(A|B) * p(B)
  • p(A ∧ B) = p(B|A) * p(A)
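The product rule can be checked on a small example; the fair-die events below are illustrative:

```python
# Check the product rule p(A ∧ B) = p(A|B) p(B) = p(B|A) p(A)
# on a hypothetical fair six-sided die.
from fractions import Fraction

outcomes = set(range(1, 7))
A = {2, 4, 6}                               # "roll is even"
B = {4, 5, 6}                               # "roll is greater than 3"

def p(event):
    return Fraction(len(event), len(outcomes))

p_A_given_B = p(A & B) / p(B)               # p(A|B)
p_B_given_A = p(A & B) / p(A)               # p(B|A)

# both factorizations recover the same joint probability
assert p(A & B) == p_A_given_B * p(B) == p_B_given_A * p(A) == Fraction(1, 3)
```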
7
Q

How can we calculate conditional probability?

A
  • p(A|B) = p(A ∧ B) / p(B), provided p(B) > 0
  • read as "the probability of A given B"
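The definition can also be read as restricting the sample space to B; the fair-die events below are illustrative:

```python
# Check p(A|B) = p(A ∧ B) / p(B): conditioning on B is the same as
# counting within the restricted sample space B (fair-die example).
from fractions import Fraction

outcomes = set(range(1, 7))
A = {2, 4, 6}                               # "roll is even"
B = {4, 5, 6}                               # "roll is greater than 3"

# definition, using counts over the full sample space
p_A_given_B = Fraction(len(A & B), len(outcomes)) / Fraction(len(B), len(outcomes))

# equivalent view: count A's outcomes inside the restricted space B
assert p_A_given_B == Fraction(len(A & B), len(B)) == Fraction(2, 3)
```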
8
Q

Describe the Bayes prior / state of nature.

A
  • It is the probability of an event occurring before any new data are observed, based on current knowledge.
  • It reflects our prior knowledge of how likely each class is.
  • Priors are known before the training process.
  • The state of nature is a random variable ω; P(ωi) is the prior probability of class ωi.
  • If there are only two classes, the priors sum to one: P(ω1) + P(ω2) = 1.
9
Q

Describe the Bayes likelihood / class-conditional probability.

A
  • It is the probability that a feature X occurs given that it belongs to a particular class ωi.
  • It is denoted P(X|ωi).
  • It is the quantity we estimate during training.
10
Q

Describe the Bayes evidence.

A
  • It is the probability of observing a particular feature, i.e. P(X).
  • Evidence values are also computed during training.
11
Q

How can we calculate the evidence in Bayes' theorem?

A

Using the law of total probability (sum rule): P(X) = Σᵢ P(X|ωi) P(ωi), summed over all n classes.
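A minimal sketch of this sum for a two-class problem; the prior and likelihood values are made up for illustration:

```python
# Law of total probability: P(X) = sum_i P(X|wi) P(wi).
# The numbers below are illustrative, not from the text.
priors = {"w1": 0.4, "w2": 0.6}             # P(wi); must sum to 1
likelihood = {"w1": 0.9, "w2": 0.2}         # P(X|wi) for one observed feature X

evidence = sum(likelihood[w] * priors[w] for w in priors)   # P(X)
assert abs(evidence - 0.48) < 1e-12         # 0.9*0.4 + 0.2*0.6
```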

12
Q

Describe the Bayes posterior probabilities.

A

 It is the probability of class ωi given the observed features X, i.e. P(ωi|X).

 It is what we aim to compute in the test phase: given test inputs (features), we determine how likely the trained model considers the features to belong to each class ωi.
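The posterior can be sketched via Bayes' rule, P(ωi|X) = P(X|ωi) P(ωi) / P(X); the prior and likelihood values below are made up for illustration:

```python
# Compute posteriors P(wi|X) via Bayes' rule for a two-class problem.
# Priors and likelihoods are illustrative, not from the text.
priors = {"w1": 0.4, "w2": 0.6}             # P(wi)
likelihood = {"w1": 0.9, "w2": 0.2}         # P(X|wi)

evidence = sum(likelihood[w] * priors[w] for w in priors)           # P(X)
posterior = {w: likelihood[w] * priors[w] / evidence for w in priors}

assert abs(sum(posterior.values()) - 1.0) < 1e-12   # posteriors sum to 1
assert posterior["w1"] > posterior["w2"]            # here w1 wins given X
```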

13
Q

What should our prediction be in the case of:
1. P(ω1) ≫ P(ω2)?
2. P(ω1) = P(ω2)?
3. What is the probability of error?

A
  1. The decision should favor ω1.
  2. Either decision is equally likely to be correct (we are right half the time).
  3. The probability of error is the minimum of P(ω1) and P(ω2).
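The three cases above can be sketched as a tiny prior-only decision rule (the `decide` helper is hypothetical):

```python
# Prior-only decision rule: with no features observed, predict the class
# with the larger prior; the probability of error is the smaller prior.
def decide(p_w1, p_w2):
    prediction = "w1" if p_w1 >= p_w2 else "w2"
    p_error = min(p_w1, p_w2)
    return prediction, p_error

assert decide(0.9, 0.1) == ("w1", 0.1)      # P(w1) >> P(w2): favor w1
assert decide(0.5, 0.5) == ("w1", 0.5)      # equal priors: error is 1/2
```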