7. Advanced Probability Concepts Flashcards

1
Q

What is joint probability?

A

The probability of two events occurring together: P(A ∩ B).

2
Q

What is marginal probability?

A

The probability of an event regardless of the outcomes of other variables, obtained by summing (or integrating) the joint distribution over those variables: P(X = x) = Σ_y P(X = x, Y = y).
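As a concrete illustration of the two cards above, here is a minimal NumPy sketch (the joint table and variable names are made up for illustration) that reads a joint probability off a table and marginalizes it by summing over the other variable:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) for X in {0, 1} (rows) and Y in {0, 1} (columns)
joint = np.array([[0.10, 0.30],
                  [0.20, 0.40]])   # entries sum to 1

p_x1_and_y1 = joint[1, 1]          # joint probability P(X = 1 ∩ Y = 1) = 0.40
p_x = joint.sum(axis=1)            # marginal P(X): sum over Y -> [0.40, 0.60]
p_y = joint.sum(axis=0)            # marginal P(Y): sum over X -> [0.30, 0.70]
print(p_x1_and_y1, p_x, p_y)
```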

3
Q

What is a moment generating function?

A

The function M_X(t) = E[e^(tX)], whose derivatives at t = 0 generate the moments of a distribution: the n-th derivative equals E[X^n], from which the mean, variance, etc. can be obtained.
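As a sketch of how the moments fall out, here is a small SymPy example (assuming SymPy is available) that differentiates the known MGF of an Exponential(λ) distribution, M(t) = λ/(λ − t), at t = 0:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                    # MGF of Exponential(lam), valid for t < lam

mean = sp.diff(M, t, 1).subs(t, 0)     # E[X]   = 1/lam
second = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = 2/lam^2
var = sp.simplify(second - mean**2)    # Var(X) = 1/lam^2
print(mean, second, var)
```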

4
Q

What is the law of total expectation?

A

The expectation of X can be obtained by averaging its conditional expectation given Y over the distribution of Y: E[X] = E[E[X|Y]].
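A minimal numeric check, using a made-up joint pmf, that computing E[X] directly and via E[E[X|Y]] gives the same number:

```python
import numpy as np

x_vals = np.array([1.0, 2.0, 3.0])
joint = np.array([[0.10, 0.20],     # rows: values of X, columns: values of Y
                  [0.15, 0.25],
                  [0.05, 0.25]])    # entries sum to 1

p_y = joint.sum(axis=0)                                      # marginal of Y
e_x_given_y = (x_vals[:, None] * joint).sum(axis=0) / p_y    # E[X | Y = y]

direct = (x_vals[:, None] * joint).sum()   # E[X] from the joint directly
towered = (e_x_given_y * p_y).sum()        # E[E[X | Y]]
print(direct, towered)                     # both 2.0
```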

5
Q

What is Chebyshev’s inequality?

A

A distribution-free bound: for any random variable with finite mean μ and standard deviation σ, P(|X − μ| ≥ kσ) ≤ 1/k², so at least 1 − 1/k² of the probability mass lies within k standard deviations of the mean.
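An empirical sketch: for samples from any distribution with finite variance (an exponential is used here purely for illustration), the observed fraction within k standard deviations of the mean should be at least 1 − 1/k²:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)   # any distribution with finite variance works
mu, sigma = x.mean(), x.std()

for k in (2, 3, 4):
    within = np.mean(np.abs(x - mu) < k * sigma)
    print(f"k={k}: observed {within:.3f} >= Chebyshev bound {1 - 1/k**2:.3f}")
```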

6
Q

What is entropy in information theory?

A

A measure of the uncertainty (expected information content) of a probability distribution: H(X) = −Σ p(x) log p(x).
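A minimal sketch of the formula (the `entropy` helper below is illustrative, not a library function):

```python
import numpy as np

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum p(x) log p(x); zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -(p * np.log(p) / np.log(base)).sum()

print(entropy([0.5, 0.5]))     # 1.0 bit  (fair coin: maximal uncertainty over 2 outcomes)
print(entropy([0.9, 0.1]))     # ~0.47 bits (biased coin: less uncertainty)
print(entropy([0.25] * 4))     # 2.0 bits (uniform over 4 outcomes)
```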

7
Q

What is mutual information?

A

A measure of the dependence between two random variables: I(X; Y) = H(X) − H(X|Y), the reduction in uncertainty about X obtained by observing Y. It is zero if and only if X and Y are independent.
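A small sketch computing I(X; Y) directly from a joint table using the equivalent form Σ p(x, y) log[p(x, y) / (p(x) p(y))]; the helper name is illustrative:

```python
import numpy as np

def mutual_information(joint, base=2):
    """I(X;Y) = sum over x,y of p(x,y) * log[ p(x,y) / (p(x) p(y)) ]."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)    # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)    # marginal of Y (columns)
    mask = joint > 0
    ratio = joint[mask] / (px @ py)[mask]
    return (joint[mask] * np.log(ratio) / np.log(base)).sum()

print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0  (independent)
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0 bit (Y determines X)
```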

8
Q

What is the Kullback-Leibler (KL) divergence?

A

A measure of how one probability distribution P diverges from a reference distribution Q: D_KL(P ∥ Q) = Σ p(x) log(p(x)/q(x)). It is non-negative and not symmetric in P and Q.
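A short sketch with two made-up discrete distributions; note the asymmetry. (SciPy's `scipy.stats.entropy(p, q)` returns the same quantity when given a second argument.)

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum p(x) * log(p(x) / q(x)), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return (p[mask] * np.log(p[mask] / q[mask])).sum()

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))   # D_KL(P || Q)
print(kl_divergence(q, p))   # D_KL(Q || P) -- generally a different number
```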

9
Q

What is Jensen’s inequality?

A

For a convex function g, E[g(X)] ≥ g(E[X]); the inequality is reversed for concave g.
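A quick numerical illustration with the convex function g(x) = x² (the sample and its parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=1.0, scale=2.0, size=100_000)

e_g_x = np.mean(x ** 2)     # E[g(X)] ≈ Var(X) + E[X]^2 = 4 + 1 = 5
g_e_x = np.mean(x) ** 2     # g(E[X]) ≈ 1
print(e_g_x, g_e_x)         # E[g(X)] >= g(E[X]), as Jensen's inequality requires
```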

10
Q

What is covariance?

A

A measure of how two random variables vary together: Cov(X, Y) = E[(X − E[X])(Y − E[Y])]. It is positive when the variables tend to move in the same direction and negative when they move in opposite directions.
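A small sketch checking the definition against NumPy's built-in estimator (the data-generating model is made up):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)
y = 2 * x + rng.normal(size=10_000)          # y tends to move with x

cov_def = ((x - x.mean()) * (y - y.mean())).mean()   # E[(X - E[X])(Y - E[Y])]
cov_np = np.cov(x, y, ddof=0)[0, 1]                  # same quantity via np.cov
print(cov_def, cov_np)                               # both ≈ 2 (= 2 * Var(X))
```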
