Probability Flashcards

1
Q

Definition of sample space

A

In an experiment, the set of all possible outcomes is the sample space (S)

2
Q

Definition of an event and written form

A

An event E is any subset of S, written E = {x in S: x in E}

3
Q

7 set operations

A

Union, intersection, complement, commutative laws, associative laws, distributive laws, De Morgan's Laws

4
Q

Definition of disjoint/ mutually exclusive events

A

Events A and B are mutually exclusive when their intersection is the empty set

5
Q

Definition of pairwise mutually exclusive

A

The subsets A1,A2,A3,… of S are pairwise mutually exclusive when the intersection of Ai and Aj is the empty set for every pair i ≠ j

6
Q

Partition

A

If A1,A2,A3,… are pairwise mutually exclusive and their union is S, then the collection {A1,A2,A3,…} partitions S

7
Q

Definition of sigma algebra

A

B, a collection of subsets of S, is a sigma algebra if it satisfies the following properties:

1) The empty set is in B
2) If A is in B, then A^c is in B
3) If A1,A2,A3,… are in B, then U Ai is in B

8
Q

What is the largest number of sets in a sigma algebra B on a sample space with n outcomes?

A

2^n

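A quick enumeration showing where 2^n comes from: the largest sigma algebra on a finite S is its power set. A minimal Python sketch:

```python
from itertools import combinations

def power_set(s):
    """All subsets of s, as frozensets."""
    items = sorted(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

S = {1, 2, 3}
B = power_set(S)
assert len(B) == 2 ** len(S)  # 8 subsets for n = 3
# Sanity check: the power set is closed under complements.
assert all(frozenset(S) - A in B for A in B)
```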
9
Q

Definition of probability

A

Given a sample space S with sigma algebra B, a probability function (or measure) is any real-valued function P with domain B that satisfies the Kolmogorov Axioms.

10
Q

Kolmogorov Axioms

A

1) P(A) >= 0 for any A in B
2) P(S) = 1
3) If A1,A2,A3,… in B are pairwise mutually exclusive, then P(U Ai) = SUM P(Ai)

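The axioms can be verified for a concrete measure, e.g. a fair die with P(A) = |A|/6 on the power set (exact arithmetic via fractions to avoid float issues):

```python
from fractions import Fraction
from itertools import combinations

# Fair die: S = {1,...,6}, B = the power set, P(A) = |A|/6.
S = {1, 2, 3, 4, 5, 6}
B = [frozenset(c) for r in range(7) for c in combinations(sorted(S), r)]
P = {A: Fraction(len(A), 6) for A in B}

# Axiom 1: nonnegativity
assert all(P[A] >= 0 for A in B)
# Axiom 2: P(S) = 1
assert P[frozenset(S)] == 1
# Axiom 3: additivity over disjoint sets
A1, A2 = frozenset({1, 2}), frozenset({5})
assert A1 & A2 == frozenset()
assert P[A1 | A2] == P[A1] + P[A2]
```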
11
Q

1) Gamma Distribution with different parameterizations
2) Expected values and variances of those distributions
3) Gamma function
4) Properties of Gamma function
5) MGF

A
12
Q

1) Exponential Distribution with different Parameterizations
2) Different expected values and variances
3) MGF

A
13
Q

1) Bernoulli Distribution
2) Expected value and variance
3) MGF

A
14
Q

1) Geometric Distribution with different parameterizations
2) Expected values and variances
3) MGF (Also special rule to help solve this)

A
15
Q

1) Poisson Distribution
2) Expected value and variance
3) MGF

A
16
Q

1) Binomial Distribution
2) Expected value and variances
3) MGF

A
17
Q

1) Beta Distribution
2) Expected value and variance
3) Beta Function
4) Expectation of nth term

A
18
Q

1) Bivariate Normal
2) Conditional expectation
3) Conditional variance

A
19
Q

1) Normal and standard normal Distributions
2) Expected values and variances
3) MGFs

A
20
Q

1) Continuous Uniform Distribution
2) Expected value and variance
3) MGF

A
21
Q

1) Multinomial Distribution
2) Expected value and variance
3) Multinomial Theorem
4) cov(x_i,x_j)

A
22
Q

Bonferroni Inequality

A

Pr(A∩B)≥Pr(A)+Pr(B)-1

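A concrete check on a finite, equally likely sample space (sets chosen here so the bound is tight):

```python
from fractions import Fraction

# Equally likely outcomes on a finite sample space.
S = set(range(10))
A = {0, 1, 2, 3, 4, 5, 6}  # P(A) = 7/10
B = {4, 5, 6, 7, 8, 9}     # P(B) = 6/10

def prob(E):
    return Fraction(len(E), len(S))

# Bonferroni: P(A ∩ B) >= P(A) + P(B) - 1
assert prob(A & B) >= prob(A) + prob(B) - 1
# Here the bound is tight: both sides equal 3/10.
assert prob(A & B) == Fraction(3, 10)
```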
23
Q

Table of ordered, non-ordered, with replacement, without replacement

A
24
Q

Fundamental Theorem of Counting

A

For a job consisting of k tasks, if there are ni ways to accomplish the ith task, then the job can be accomplished in n1n2…nk ways

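The counting principle in action via itertools.product (the task counts here are arbitrary examples):

```python
from itertools import product

# k = 3 tasks with n1 = 2, n2 = 3, n3 = 4 ways each.
task1 = ["a", "b"]
task2 = ["x", "y", "z"]
task3 = [1, 2, 3, 4]

# Every way of doing the whole job is one tuple from the product.
ways = list(product(task1, task2, task3))
assert len(ways) == 2 * 3 * 4  # n1 * n2 * n3 = 24
```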
25
Q

Inequality between unordered with replacement and without replacement

A

Unordered with replacement gives at least as many outcomes as unordered without replacement: (n+r-1 choose r) >= (n choose r)

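Assuming the intended inequality is C(n+r-1, r) >= C(n, r) (unordered with replacement vs. without), a quick exhaustive check over small n and r:

```python
from math import comb

def unordered_without(n, r):
    return comb(n, r)           # choose r from n, no replacement

def unordered_with(n, r):
    return comb(n + r - 1, r)   # "stars and bars" count

# With replacement never gives fewer unordered outcomes.
for n in range(1, 8):
    for r in range(n + 1):
        assert unordered_with(n, r) >= unordered_without(n, r)
```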
26
Q

Binomial Theorem

A
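The binomial theorem, (a+b)^n = SUM over k of C(n,k) a^k b^(n-k), can be checked term by term:

```python
from math import comb

def binomial_expand(a, b, n):
    """(a + b)^n computed term by term via the binomial theorem."""
    return sum(comb(n, k) * a**k * b**(n - k) for k in range(n + 1))

for n in range(6):
    assert binomial_expand(2, 3, n) == 5 ** n  # (2 + 3)^n
```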
27
Q

Pascal’s Formula

A
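Pascal's formula states C(n, k) = C(n-1, k-1) + C(n-1, k); a quick exhaustive check for small n:

```python
from math import comb

# Pascal's formula: C(n, k) = C(n-1, k-1) + C(n-1, k)
for n in range(1, 12):
    for k in range(1, n):
        assert comb(n, k) == comb(n - 1, k - 1) + comb(n - 1, k)
```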
28
Q

Three useful properties of binomial coefficients

A
29
Q

Bayes’ rule (for 2 sets and generally)

A
30
Q

Definition of conditional independence

A

A is conditionally independent of C given B if P[A|B,C]=P[A|B]

31
Q

Total Law of Probability

A
32
Q

Definition of a random variable

A

A function from a sample space S into the real numbers

Formally: P[X=xi] = P({sj in S: X(sj)=xi})

33
Q

Conditions for a function to be a CDF (iff)

A
34
Q

If RV’s have the same cdf then…

A

X and Y are identically distributed (equal in distribution); a common cdf alone does not make them independent

35
Q

Can a RV be both discrete and continuous?

A

Yes — a random variable can have a mixed distribution, with both a discrete part (point masses) and a continuous part

36
Q

If X and Y are identically distributed (FX(x)=FY(x)), does this mean X=Y?

A

No

37
Q

A function f(x) is a pdf (or pmf) iff

A
38
Q

Definition of absolutely continuous x

A

X is absolutely continuous when its cdf FX(x) is continuous and differentiable for all x, so that a pdf fX(x) = d/dx FX(x) exists

39
Q

For a RV x, and Y=g(x), what does fy(y) equal? (A transformation of random variable)

A
40
Q

Let X have cdf FX(x), let Y=g(X), and let X={x: fX(x)>0} and Y={y:y=g(x) for some x in X}

What if g is an increasing/ decreasing function on X?

A

1) If g is an increasing function on X, FY(y)=FX(g^-1(y))
2) If g is a decreasing function on X and X is a continuous random variable, FY(y)=1-FX(g^-1(y))

41
Q

Is it always true that E[g(x)]=g(E[x])?

A

No

42
Q

How to find a moment generating function MX(t) and the moments of a probability distribution

A

MX(t)=E[e^(tX)]

The nth moment is the nth derivative evaluated at zero: E[X^n] = d^n/dt^n MX(t) at t=0

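A numerical sketch of the derivative-at-zero recipe, using the Bernoulli(p) MGF M(t) = 1 - p + p*e^t and finite differences to approximate the derivatives:

```python
import math

# Bernoulli(p): MX(t) = 1 - p + p*e^t; moments come from derivatives at t = 0.
p = 0.3
M = lambda t: 1 - p + p * math.exp(t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)          # central difference ~ M'(0) = E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ M''(0) = E[X^2]

assert abs(m1 - p) < 1e-6                       # E[X] = p
assert abs(m2 - p) < 1e-6                       # E[X^2] = p (since X^2 = X)
assert abs((m2 - m1**2) - p * (1 - p)) < 1e-6   # Var(X) = p(1 - p)
```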
43
Q

Three properties of MGFs

A
44
Q

3 mathematical properties

A
45
Q

Explain hypergeometric distribution in words

A

Given N total objects, M of which are of type 1, it gives the probability that a sample of K objects drawn without replacement contains exactly k objects of type 1.

46
Q

Explain the negative binomial in words

A

Gives the distribution of the number of trials (or, in another parameterization, failures) needed to obtain r successes

47
Q

Memoryless Property

A
48
Q

Shapes of Beta distributions

A
49
Q

General idea of Poisson process

A
50
Q

Two major ways to determine exponential families with respective terms explained

A
51
Q

Definition of curved and full exponential family

A

A curved exponential family is one in which the number of free parameters is less than k (the number of terms in the exponential-family representation)

A full exponential family is one in which the number of parameters equals k

52
Q

Definitions of location, scale, and location-scale families

A

A location family takes a standard pdf f(x) and indexes it by a location parameter u, so that f(x-u) is in the family for every real u

A scale family indexes f(x) by a positive scale parameter s, so that (1/s)f(x/s) is in the family for every s > 0

A location-scale family combines the two: (1/s)f((x-u)/s)

53
Q

Markov Inequality

A
54
Q

Chebyshev’s Inequality

A
55
Q

How to find E[X2|X1] given a joint probability function for the discrete and continuous cases

A
56
Q

Given f(X1, X2) what is E[X2] (for discrete and continuous case)

A
57
Q

If X1 and X2 are independent and Z=X1 + X2 then what does this say about the mgf’s for these variables?

A

Mz(t)=MX1(t)MX2(t)
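The product rule can be checked by direct enumeration for two independent Bernoulli variables (the parameters here are arbitrary examples):

```python
import math

# X1 ~ Bernoulli(0.3), X2 ~ Bernoulli(0.6), independent; Z = X1 + X2.
p1, p2 = 0.3, 0.6

def mgf_bern(p, t):
    return 1 - p + p * math.exp(t)

def mgf_sum(t):
    # E[e^(tZ)] by enumerating the joint pmf of (X1, X2).
    total = 0.0
    for x1, q1 in [(0, 1 - p1), (1, p1)]:
        for x2, q2 in [(0, 1 - p2), (1, p2)]:
            total += q1 * q2 * math.exp(t * (x1 + x2))
    return total

for t in [-1.0, 0.0, 0.5, 2.0]:
    assert math.isclose(mgf_sum(t), mgf_bern(p1, t) * mgf_bern(p2, t))
```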

58
Q

What is the equation for bivariate transformations for the continuous case?

A
59
Q

If a joint probability function can be factorized what does this say about its factors?

A

If f(x1,x2) factorizes as g(x1)h(x2) (with a product support set), then X1 and X2 are independent

60
Q

Basic idea of Hierarchical models

A

You are given f(X|Y) and f(Y) and need to find f(X)

61
Q

E[X] in terms of conditional expectations for hierarchical models

A

E[X] = EY[E[X|Y]]

62
Q

Var(Y) in terms of conditional expectations for hierarchical models

A

Var(Y) = EX[Var(Y|X)] + VarX(E[Y|X])
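Both conditional-expectation identities — the law of total expectation and the variance decomposition above — can be verified on a small discrete joint pmf (the numbers here are made up):

```python
# Small discrete joint pmf of (X, Y); probabilities sum to 1.
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

EY = sum(p * y for (x, y), p in pmf.items())
VarY = sum(p * (y - EY) ** 2 for (x, y), p in pmf.items())

# Marginal of X and the conditional mean/variance of Y given each x.
xs = {x for x, _ in pmf}
pX = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in xs}
EY_given = {x: sum(p * y for (a, y), p in pmf.items() if a == x) / pX[x]
            for x in xs}
VarY_given = {x: sum(p * (y - EY_given[x]) ** 2
                     for (a, y), p in pmf.items() if a == x) / pX[x]
              for x in xs}

# Law of total expectation: E[Y] = EX[E[Y|X]]
EE = sum(pX[x] * EY_given[x] for x in xs)
assert abs(EE - EY) < 1e-12

# Law of total variance: Var(Y) = EX[Var(Y|X)] + VarX(E[Y|X])
E_of_Var = sum(pX[x] * VarY_given[x] for x in xs)
Var_of_E = sum(pX[x] * (EY_given[x] - EE) ** 2 for x in xs)
assert abs(E_of_Var + Var_of_E - VarY) < 1e-12
```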

63
Q

Two forms of covariance

A
64
Q

Correlation equation

A
65
Q

cov(aX,bY)=

A

ab·cov(X,Y)

66
Q

cov(X+Y,W+Z)=

A

cov(X,Z)+cov(X,W)+cov(Y,W)+cov(Y,Z)
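Both covariance identities — the scaling rule cov(aX,bY) = ab·cov(X,Y) and the expansion above — can be checked on a small, made-up dataset of equally likely points:

```python
# Equally likely sample points for four random variables (made-up data).
data = [  # (x, y, w, z)
    (1.0, 2.0, 0.0, 1.0),
    (2.0, 1.0, 3.0, 0.0),
    (0.0, 4.0, 1.0, 2.0),
    (3.0, 0.0, 2.0, 5.0),
]

def cov(f, g):
    n = len(data)
    mf = sum(f(*row) for row in data) / n
    mg = sum(g(*row) for row in data) / n
    return sum((f(*row) - mf) * (g(*row) - mg) for row in data) / n

X = lambda x, y, w, z: x
Y = lambda x, y, w, z: y
W = lambda x, y, w, z: w
Z = lambda x, y, w, z: z

a, b = 2.0, -3.0
# cov(aX, bY) = ab * cov(X, Y)
assert abs(cov(lambda *r: a * X(*r), lambda *r: b * Y(*r))
           - a * b * cov(X, Y)) < 1e-9
# cov(X+Y, W+Z) = cov(X,W) + cov(X,Z) + cov(Y,W) + cov(Y,Z)
lhs = cov(lambda *r: X(*r) + Y(*r), lambda *r: W(*r) + Z(*r))
rhs = cov(X, W) + cov(X, Z) + cov(Y, W) + cov(Y, Z)
assert abs(lhs - rhs) < 1e-9
```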

67
Q

Given data, basic difference between Classical and Bayesian approach

A

The classical (frequentist) model treats the parameter as a fixed, unknown constant and uses the sampling distribution of the data to learn about it

The Bayesian model treats the parameter as a random variable: prior knowledge about it is combined with the collected data to form a posterior distribution. This approach relies on exchangeability (conditional independence given the parameter) of the samples

68
Q

Cauchy-Schwarz Inequality

A
69
Q

Holder’s Inequality

A
70
Q

Jensen’s Inequality

A
71
Q

If X1,X2,X3,… are mutually independent, what does this say about their joint function, the expectation of a product of transformations E[g1(X1)g2(X2)…], and the MGF of their sum?

A
72
Q

If X1,X2,X3,… are independent then what does this say about the transformation of these vectors?

A
73
Q

What is the “mission” of a statistician?

A

To learn from data (by obtaining a sample) to make judgments about the unknown (through populations and their parameters)

74
Q

What is the connection between a sample and a population?

A

Probability (a measure of randomness/stochasticity)

75
Q

What is a statistic

A

A summary of the sample

76
Q

Equation for S2

A
77
Q

Convergence in Probability

A
78
Q

WLLN

A
79
Q

Convergence in Distribution

A
80
Q

Convergence almost surely

A
81
Q

SLLN

A
82
Q

Definition of consistent

A

A statistic is consistent when it converges in probability to the truth

83
Q

Central Limit Theorem

A
84
Q

Comparison of convergence almost surely and convergence in probability

A

Convergence almost surely is stronger: if a sequence converges almost surely then it converges in probability, but the converse does not always hold

85
Q

Comparison between convergence in probability and convergence in distribution

A

Convergence in probability is stronger: if a sequence converges in probability then it converges in distribution, but the converse does not always hold (except when the limit is a constant)

86
Q

Slutsky’s Theorem

A
87
Q

Three things necessary for proving x follows a t distribution

A
  1. Xbar (the sample mean) and S2 are independent
  2. Xbar ~ N(u, sig2/n)
  3. (n-1)S2/sig2 ~ X2(n-1), chi-squared with n-1 degrees of freedom
88
Q

Things to remember:

a) If each xi follows a normal then x2i follows what dist?
b) SUM( x2i) follows what distribution (sum to n)
c) A chi squared distribution (with p degrees of freedom) follows what distribution with certain parameters?
d) SUM(xi-E[X])=?
e) SUM(xi-u)2=?
f) S2=? (not just usual equation)

A
89
Q

t statistic and distribution

A
90
Q

1) Binomial to Poisson
2) Binomial to Bernoulli
3) Binomial to Normal

A
91
Q

Bernoulli to Binomial

A

SUM Xi

92
Q

Hypergeometric to Binomial

A

p=M/N, n=K, N->infinity
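The limit can be seen numerically: with p = M/N held fixed and N large, the hypergeometric pmf is close to the Binomial(K, p) pmf:

```python
from math import comb

def hyper_pmf(k, N, M, K):
    """P[X = k] when drawing K objects without replacement from N, M of type 1."""
    return comb(M, k) * comb(N - M, K - k) / comb(N, K)

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Hold p = M/N and n = K fixed while N grows: the pmfs converge.
N, p, K = 1_000_000, 0.3, 5
M = int(N * p)
for k in range(K + 1):
    assert abs(hyper_pmf(k, N, M, K) - binom_pmf(k, K, p)) < 1e-4
```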

93
Q

1) Beta to Normal
2) Beta to Continuous Uniform

A

1) alpha=beta->infinity
2) alpha=beta=1

94
Q

1) Negative Binomial to Poisson
2) Negative Binomial to Geometric

A
95
Q

1) Geometric to Negative Binomial
2) Geometric to itself

A
96
Q

1) Poisson to Normal
2) Poisson to itself

A
97
Q

1) Normal to itself
2) Normal to standard normal
3) Normal to lognormal

A
98
Q

1) Gamma to Exponential
2) Gamma to Normal
3) Gamma to Beta
4) Gamma to Chi-squared

A
99
Q

1) Exponential to Continuous uniform
2) Exponential to Gamma
3) Exponential to Chi-squared

A
100
Q

1) Chi-squared to itself
2) Chi-squared to F
3) Chi-squared to Exponential

A
101
Q

1) Standard Normal to Cauchy
2) Standard Normal to Chi-Squared

A
102
Q

F to Chi-squared

A
103
Q

1) t to F
2) t to Standard normal
3) t to Cauchy

A
104
Q

Cauchy to itself (two ways)

A

1) SUM Xi (and hence Xbar) is again Cauchy
2) 1/X (for standard Cauchy X)

105
Q

1) Hypergeometric distribution
2) Expected value and variance

A
106
Q

1) Negative Binomial distribution
2) Expected value and variance
3) MGF

A
107
Q

1) Cauchy distribution
2) Expected value and variance
3) MGF

A
108
Q

1) Chi-squared distribution
2) Expected value and variance
3) MGF

A
109
Q

1) F distribution
2) Expected value and variance
3) MGF

A
110
Q

1) Lognormal distribution
2) Expected value and variance
3) MGF

A
111
Q

1) t distribution
2) expected value and variance
3) MGF

A