2. Random Variables Flashcards

1
Q

Discrete vs Continuous Random Variables

A
  • discrete random variables can only take on a finite or at most countably infinite number of values
  • continuous random variables can take on a continuum of values
2
Q

Probability of an Event

A

-for a discrete random variable X and a real value a, the event X=a is the set of outcomes in Ω for which the random variable assumes the value a
-i.e. X=a ≡ {ω∈Ω | X(ω)=a}
-the probability of this event is:
Pr(X=a) = Σ Pr(ω)
-where the sum is taken over ω∈Ω such that X(ω)=a
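The definition above can be sketched in code; a minimal example using two fair coin flips as the sample space (the outcome encoding and function names are just an illustration):

```python
from fractions import Fraction
from itertools import product

# Sample space Omega: all outcomes of two fair coin flips, each with probability 1/4.
omega = list(product("HT", repeat=2))
pr = {w: Fraction(1, 4) for w in omega}

def X(w):
    """Random variable X: the number of heads in outcome w."""
    return w.count("H")

def prob_X_equals(a):
    """Pr(X = a): sum Pr(omega) over outcomes with X(omega) = a."""
    return sum(pr[w] for w in omega if X(w) == a)
```

Here prob_X_equals(1) gives 1/2: the event X=1 consists of the two outcomes HT and TH.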

3
Q

Probability Mass Function

A

-the frequency or probability mass function (PMF) of a discrete random variable X gives the probabilities for all the possible values of X:
px(x) = Pr(X=x)
-it must satisfy the property:
Σ px(x) = Σ Pr(X=x) = 1
-where the sum is taken over all possible values x
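As a quick sketch, using a fair six-sided die as an assumed example:

```python
from fractions import Fraction

# PMF of a fair six-sided die: p_X(x) = Pr(X = x) = 1/6 for x = 1..6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Defining property: the probabilities over all possible values sum to 1.
assert sum(pmf.values()) == 1
```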

4
Q

Cumulative Distribution Function

A
-a cumulative distribution function (CDF) is a non-decreasing function F defined as:
F(x) = P(X≤x)
-and satisfies:
lim x->-∞ F(x) = 0
and:
lim x->+∞ F(x) = 1
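For a discrete variable the CDF can be built directly from the PMF; a small sketch with an assumed fair-die PMF:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die (assumed example)

def cdf(x):
    """F(x) = Pr(X <= x): accumulate PMF mass at values <= x."""
    return sum(p for v, p in pmf.items() if v <= x)

# F is non-decreasing, 0 below all possible values and 1 above them.
assert cdf(0) == 0 and cdf(6) == 1
assert all(cdf(x) <= cdf(x + 1) for x in range(0, 6))
```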
5
Q

Independence of Random Variables

Definition

A

-two random variables X and Y are independent if every event expressible in terms of X alone is independent of every event expressible in terms of Y alone, in particular:
P(X≤x & Y≤y) = P(X≤x)P(Y≤y)

6
Q

Independence of Discrete Random Variables

Definition

A

-for two independent discrete random variables X and Y taking on values xi and yj respectively:
P(X=xi & Y=yj) = P(X=xi)P(Y=yj)

7
Q

Bernoulli Random Variables

A

-take on only two values, 1 or 0
p(x) = {p, if x=1; 1-p, if x=0; 0, otherwise}
-a useful compact representation is:
p(x) = {p^x (1-p)^(1-x), if x=0 or x=1; 0, otherwise}
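The compact form can be checked against the case definition (the function name is illustrative):

```python
def bernoulli_pmf(x, p):
    """p(x) = p^x (1-p)^(1-x) for x in {0, 1}, and 0 otherwise."""
    if x not in (0, 1):
        return 0.0
    return p**x * (1 - p) ** (1 - x)

# The compact form reproduces the two cases: p at x=1, 1-p at x=0.
assert bernoulli_pmf(1, 0.3) == 0.3
assert bernoulli_pmf(0, 0.3) == 1 - 0.3
```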

8
Q

What are bernoulli random variables used for?

A

-let A⊂Ω be an event in the sample space Ω, and let:
X(ω) = {1, if ω∈A; 0, otherwise}
-then X is an indicator random variable that takes on value 1 if A occurs and 0 otherwise
-Bernoulli random variables often represent success vs. failure of an experiment

9
Q

Binomial Distribution

A

-an experiment is performed n times
-each trial is independent of the others
-each trial results in success with probability p (i.e. each is described by a Bernoulli random variable Yj)
-the random variable X=ΣYj denoting the number of successes in the n independent Bernoulli trials has a binomial distribution:
p(k) = P(X=k) = nCk p^k (1-p)^(n-k), k = 0, 1, ..., n
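The PMF translates directly into code with math.comb; a small sketch:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) = C(n, k) p^k (1-p)^(n-k), for k = 0..n."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Example: P(exactly 2 heads in 4 fair flips) = C(4,2) / 2^4 = 6/16.
assert binomial_pmf(2, 4, 0.5) == 0.375
```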

10
Q

Binomial Theorem

A

(X + Y)^m = Σ mCk X^k Y^(m-k)
-setting X=p and Y=1-p shows the binomial probabilities sum to 1:
1 = 1^m = (p + (1-p))^m = Σ mCk p^k (1-p)^(m-k)
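Both identities can be verified numerically (the values of x, y, m, p below are arbitrary):

```python
from math import comb

def expansion(x, y, m):
    """Right-hand side of the binomial theorem: sum of C(m,k) x^k y^(m-k)."""
    return sum(comb(m, k) * x**k * y ** (m - k) for k in range(m + 1))

# (x + y)^m matches the expansion (exact for integers).
assert expansion(2, 3, 4) == (2 + 3) ** 4
# With x = p, y = 1-p the expansion shows the binomial PMF sums to 1.
assert abs(expansion(0.3, 0.7, 10) - 1.0) < 1e-12
```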

11
Q

Geometric Distribution

A

-a sequence of independent Bernoulli trials is performed, with no upper bound on the number of trials
-the random variable X denoting the number of trials up to and including the first success has a geometric distribution:
p(k) = P(X=k) = p(1-p)^(k-1), k = 1, 2, ...
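A sketch, checking that the probabilities sum to 1 over a long prefix of k values:

```python
def geometric_pmf(k, p):
    """P(X = k) = p (1-p)^(k-1): first success occurs on trial k, k = 1, 2, ..."""
    return p * (1 - p) ** (k - 1)

# The infinite sum over k >= 1 is 1; a long prefix gets numerically close.
total = sum(geometric_pmf(k, 0.2) for k in range(1, 500))
assert abs(total - 1.0) < 1e-12
```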

12
Q

Negative Binomial Distribution

A

-same assumptions as the geometric distribution: a sequence of Bernoulli trials with no upper bound
-the random variable X denoting the number of trials required until the rth success has a negative binomial distribution
-the event X=k happens when the first k-1 trials contain exactly r-1 successes and the kth trial is also a success:
p(k) = P(X=k) = (k-1)C(r-1) p^r (1-p)^(k-r)
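A sketch; with r = 1 the formula should reduce to the geometric distribution, which gives a handy sanity check:

```python
from math import comb

def neg_binomial_pmf(k, r, p):
    """P(X = k) = C(k-1, r-1) p^r (1-p)^(k-r): r-th success occurs on trial k."""
    return comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

# r = 1 reduces to the geometric PMF p(1-p)^(k-1).
assert neg_binomial_pmf(5, 1, 0.3) == 0.3 * 0.7**4
```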

13
Q

Poisson Distribution

A

-can be derived as the limit of the binomial distribution
-consider a binomial distribution with large n and small p
-let λ=np
-let n->∞ and p->0 such that λ remains constant
P(X=k) = λ^k e^(-λ) / k!, k = 0, 1, 2, ...
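The limit can be observed numerically: a binomial with large n and small p, λ = np held fixed, is close to the Poisson PMF (the parameter values below are arbitrary):

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = lambda^k e^(-lambda) / k!"""
    return lam**k * exp(-lam) / factorial(k)

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# With lambda = n*p held fixed, the binomial PMF approaches the Poisson PMF as n grows.
lam, n = 2.0, 10_000
assert abs(binomial_pmf(3, n, lam / n) - poisson_pmf(3, lam)) < 1e-3
```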

14
Q

Expectation of a Discrete Random Variable

A

E(X) = Σ k·px(k) = Σ k·P(X=k)
-where the sum is taken over all possible values k of X
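A sketch for a fair six-sided die (an assumed example):

```python
from fractions import Fraction

# E(X) = sum of k * P(X = k) for a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
expectation = sum(k * p for k, p in pmf.items())
assert expectation == Fraction(7, 2)  # 3.5
```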

15
Q

Linearity of Expectation

A

-the expectation of a sum of random variables equals the sum of their expectations (independence is not required):
E(Σ Xi) = Σ E(Xi)
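Linearity holds even for perfectly dependent variables; a sketch with Y = X (so X and Y are as dependent as possible), using an assumed fair-die PMF:

```python
from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}  # fair die (assumed example)
E_X = sum(k * p for k, p in pmf.items())

# Take Y = X, so E(X + Y) = E(2X), computed directly from the PMF.
E_sum = sum((k + k) * p for k, p in pmf.items())
assert E_sum == E_X + E_X  # linearity needs no independence
```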

16
Q

Expectation Lemma

A

-for any constant c:

E(cX) = cE(X)

17
Q

Probability Density Function

Definition

A
  • the role of the probability mass function is taken on by the density function f(x) which satisfies:
    i) f(x) ≥ 0
    ii) f(x) is piecewise continuous
    iii) ∫ f(x) dx = 1, integration between -∞ and +∞
    iv) Pr(a ≤ X ≤ b) = ∫ f(x) dx, integration between a and b
18
Q

What is the probability that a continuous random variable takes on a particular single value?

A

Pr(X=c) = Pr(c ≤ X ≤ c) = ∫ f(x) dx = 0
-where the integration is from c to c
-so a continuous random variable takes on any particular single value with probability 0

19
Q

Fundamental Theorem of Calculus

A

-if f(x) is continuous at x, then f(x)=F'(x)
=> Pr(a≤X≤b) = ∫ f(x) dx = F(b)-F(a)
-where the integration is from a to b

20
Q

Quantiles

A
  • the pth quantile of F is defined to be the value xp such that F(xp)=p
  • for p=1/2, xp is the median
  • for p=1/4, xp is the lower quartile
  • for p=3/4, xp is the upper quartile
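A sketch: for a CDF with a closed-form inverse, such as the exponential distribution (used here purely as an example), the pth quantile solves F(xp) = p directly:

```python
from math import exp, log

def exp_cdf(x, lam):
    """Exponential CDF (example distribution): F(x) = 1 - e^(-lam x), x >= 0."""
    return 1 - exp(-lam * x)

def exp_quantile(p, lam):
    """p-th quantile: solve F(x_p) = p, giving x_p = -ln(1-p)/lam."""
    return -log(1 - p) / lam

# p = 1/2 gives the median; plugging it back into F recovers p.
median = exp_quantile(0.5, 2.0)
assert abs(exp_cdf(median, 2.0) - 0.5) < 1e-12
```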
21
Q

Standard Uniform Distribution

A

-if we pick a random number in the interval [0,1] we are describing:
f(x) = {1, if 0≤x≤1; 0, otherwise}
-CDF:
F(x) = {0, if x<0; x, if 0≤x≤1; 1, if x>1}

22
Q

Uniform Distribution

A

-pdf:

f(x) = {1/(b-a), if a≤x≤b; 0, otherwise (x<a or x>b)}

23
Q

Exponential Distribution

A

-like the Poisson distribution, depends on only one parameter
-pdf:
f(x) = {λe^(-λx), if x≥0; 0, if x<0}
-cdf:
F(x) = {1-e^(-λx), if x≥0; 0, if x<0}
-used to model lifetimes and waiting times
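A numeric sanity check that the CDF integrates the PDF (the rate λ and the interval are arbitrary):

```python
from math import exp

lam = 1.5  # arbitrary rate parameter

def pdf(x):
    return lam * exp(-lam * x) if x >= 0 else 0.0

def cdf(x):
    return 1 - exp(-lam * x) if x >= 0 else 0.0

# Midpoint Riemann sum of the PDF over [0, b] should match F(b).
b, n = 2.0, 100_000
dx = b / n
riemann = sum(pdf((i + 0.5) * dx) * dx for i in range(n))
assert abs(riemann - cdf(b)) < 1e-6
```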

24
Q

Normal Distribution

A
-often used as a generic model for random quantities
-bell-shaped curve
X~N(µ,σ²)
-pdf:
f(x) = 1/(σ√(2π)) · e^[-(x-µ)²/(2σ²)]
-standard normal:
Z~N(0,1)
25
Q

Normal Distribution

Y=aX+b Proposition

A

-suppose X~N(µ,σ²)
-if Y=aX+b with a≠0
-then Y is normally distributed as well:
Y~N(aµ+b, a²σ²)

26
Q

Normal Distribution

Z=F(X) Proposition

A
-let Z=F(X) where X is a continuous random variable with CDF F, then Z has a uniform distribution on [0,1]:
Z~U(0,1)
27
Q

Normal Distribution

X=F^(-1)(U) Proposition

A

-let U be uniform on [0,1] and let X = F^(-1)(U); then the cumulative distribution function of X is F
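This proposition is the basis of inverse transform sampling; a sketch that generates exponential samples from uniform ones (the seed and parameters are arbitrary):

```python
import random
from math import log

def sample_exponential(lam, rng):
    """X = F^(-1)(U) for F(x) = 1 - e^(-lam x), so F^(-1)(u) = -ln(1-u)/lam."""
    u = rng.random()  # U ~ Uniform[0, 1)
    return -log(1 - u) / lam

rng = random.Random(0)  # fixed seed so the check is reproducible
samples = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)

# The exponential mean is 1/lambda = 0.5; the sample mean should be close.
assert abs(mean - 0.5) < 0.01
```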