Topic 4 - Discrete Random Variables Flashcards

1
Q

What is P(X = x) formally known as

A
  • Probability distribution function or probability mass function
2
Q

What does the probability distribution/mass function show

A
  • The probabilities for all the possible outcomes
  • Can be shown algebraically, graphically or in a table
3
Q

What is E(X) for a discrete random variable

A
  • E(X) = sum over all x of x * P(x)
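As a quick sketch, the expectation formula can be checked in Python; the fair six-sided die PMF here is an assumed example, not from the cards:

```python
# Hypothetical example: PMF of a fair six-sided die
pmf = {x: 1/6 for x in range(1, 7)}

# E(X) = sum of x * P(x) over all possible values x
expected = sum(x * p for x, p in pmf.items())
print(expected)  # ≈ 3.5
```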
4
Q

What is the cumulative distribution function denoted as

A
  • F(x0), where F(x0) = P(X <= x0)
5
Q

What is the variance for a discrete random variable

A
  • Var(X) = E[(X - mu)^2] = sum over all x of (x - mu)^2 * P(x)
6
Q

What is the standard deviation of a discrete random variable

A
  • sqrt of Var(X), i.e. sqrt of sum of (x - mu)^2 * P(x)
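A minimal sketch of the variance and standard deviation formulas, reusing the assumed fair-die PMF:

```python
import math

# Hypothetical example: variance and SD of a fair six-sided die
pmf = {x: 1/6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())  # E(X) = 3.5

# Var(X) = sum of (x - mu)^2 * P(x); SD is its square root
var = sum((x - mu) ** 2 * p for x, p in pmf.items())
sd = math.sqrt(var)
print(var, sd)  # ≈ 2.9167, ≈ 1.7078
```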
7
Q

What are E(a) and Var(a) equivalent to

A
  • E(a) = a
  • Var(a) = 0
8
Q

What are E(bX) and Var(bX) equivalent to

A
  • E(bX) = b * E(X)
  • Var(bX) = b^2 * Var(X)
9
Q

If Y = a + bX what are E(Y) and Var(Y) equivalent to

A
  • E(a + bX) = b * E(X) + a
  • Var(a + bX) = b^2 * Var(X)
10
Q

What is a Bernoulli distribution

A
  • A Bernoulli random variable X can be thought of as an indicator of success taking two values
  • X = 1 if success, X = 0 if failure
11
Q

When is a random variable said to have a Bernoulli distribution

A
  • if P(X = 1) = p and P(X = 0) = 1-p
12
Q

What are E(X) and Var(X) for a Bernoulli distribution

A
  • E(X) = p
  • Var(X) = p(1 - p)
13
Q

What is the binomial distribution formula

A
  • P(X = k) = n! / (k!(n-k)!) * p^k * (1-p)^(n-k)
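The binomial PMF can be sketched directly in Python using the built-in binomial coefficient; the coin-flip numbers are an assumed example:

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Hypothetical example: exactly 3 heads in 10 fair coin flips
print(binomial_pmf(3, 10, 0.5))  # ≈ 0.1172
```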
14
Q

How is the binomial distribution denoted

A
  • Bin(n,p)
15
Q

How is the Bernoulli distribution written as a special case of the binomial distribution

A
  • Bin(1,p)
16
Q

What are E(X) and Var(X) for a binomial distribution

A
  • E(X) = np
  • Var(X) = np(1-p)
17
Q

When would a Poisson distribution be used

A
  • Where we are counting the number of successes in a particular space or interval of time
18
Q

What are the 4 assumptions of a Poisson distribution

A
  • An event can occur any number of times in a given time period
  • The probability of an event occurring is proportional to the length of the time period
  • The rate of occurrence is constant
  • Events occur independently
19
Q

What is the Poisson distribution formula

A
  • P(X = k) = e^(-λ) * λ^k / k!
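A minimal sketch of the Poisson PMF; the rate and count used are assumed example values:

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) = e^(-lam) * lam^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# Hypothetical example: exactly 2 events when the mean rate is 3 per interval
print(poisson_pmf(2, 3))  # ≈ 0.2240
```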
20
Q

What is lambda (λ) in a Poisson distribution

A
  • A constant that specifies the average number of occurrences for a particular time/space interval
21
Q

What are E(X) and Var(X) for a Poisson distribution

A
  • E(X) = λ
  • Var(X) = λ
22
Q

How is the Poisson distribution denoted

A
  • Po(λ)
23
Q

What is the general rule for a Poisson distribution when changing time intervals

A
  • if X ~ Po(λ) on 1 unit interval then Y ~ Po(kλ) on k unit intervals
24
Q

When can the Poisson distribution be used to approximate binomial probabilities

A
  • When the number of trials n is large
  • When the probability p is small
  • When np is moderate (preferably np <= 7)
25
Q

Show mathematically how the Poisson approximates the binomial

A
  • Bin(n,p) -> λ = np
  • Po(np)
  • P(X = k) = e^(-np) * (np)^k / k!
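The approximation can be sketched numerically; n, p and k below are assumed example values chosen so that np = 3 (within the np <= 7 guideline):

```python
from math import comb, exp, factorial

# Hypothetical example: n large, p small
n, p, k = 100, 0.03, 2

binom = comb(n, k) * p**k * (1 - p) ** (n - k)       # exact Bin(n, p)
poisson = exp(-n * p) * (n * p) ** k / factorial(k)  # Po(np) approximation
print(binom, poisson)  # the two probabilities are close
```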
26
Q

What is a joint probability mass function (PMF)

A
  • Used to express the probability that X takes the specific value x and simultaneously Y takes the value y, as a function of x and y
27
Q

How can a joint probability mass function be represented mathematically

A
  • P(x,y) = P(X = x ∩ Y = y)
  • where ∩ denotes the intersection (both events occur)
28
Q

How are the marginal probability mass functions, P(X) and P(Y), calculated using the joint function P(x,y)

A
  • P(x) = sum over all y of P(x,y)
  • P(y) = sum over all x of P(x,y)
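A sketch of marginalising a joint PMF; the joint table below is hypothetical, with probabilities assumed for illustration:

```python
# Hypothetical joint PMF P(x, y) for two binary variables
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginals: P(x) sums over y, P(y) sums over x
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p
print(px)  # ≈ {0: 0.3, 1: 0.7}
print(py)  # ≈ {0: 0.4, 1: 0.6}
```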
29
Q

How do you calculate the conditional probability mass function using marginal and joint functions

A
  • P(y | x) = P(x,y) / P(x)
  • P(x | y) = P(x,y) / P(y)
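The conditional formula can be sketched on a small hypothetical joint table (all probabilities assumed):

```python
# Hypothetical joint PMF for two binary variables
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginal P(x = 1), then the conditional P(y | x = 1) = P(x, y) / P(x)
px1 = joint[(1, 0)] + joint[(1, 1)]
p_y_given_x1 = {y: joint[(1, y)] / px1 for y in (0, 1)}
print(p_y_given_x1)  # ≈ {0: 0.4286, 1: 0.5714}
```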
30
Q

When are jointly distributed random variables X and Y said to be independent

A
  • When P(x,y) = P(x) * P(y) for all x and y
31
Q

How is covariance calculated for joint variables x and y (discrete random variables)

A
  • Covariance is the expected value of (X - mu of x)(Y - mu of y)
  • Cov(X,Y) = E[(X - mu of x)(Y - mu of y)] = sum over all x and all y of (x - mu of x)(y - mu of y) * P(x,y)
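The covariance sum can be sketched over a hypothetical joint PMF (probabilities assumed for illustration):

```python
# Hypothetical joint PMF for two binary variables
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

mu_x = sum(x * p for (x, y), p in joint.items())  # E(X)
mu_y = sum(y * p for (x, y), p in joint.items())  # E(Y)

# Cov(X,Y) = sum over all x and y of (x - mu_x)(y - mu_y) * P(x, y)
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())
print(cov)  # ≈ -0.02
```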
32
Q

What does covariance measure

A
  • Measures the strength of the linear relationship between two variables
  • If the two variables are statistically independent the covariance between them is 0
33
Q

How is the correlation between X and Y calculated

A

ρ = Corr(X,Y) = Cov(X,Y) / (S.D. of X * S.D. of Y)
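A sketch of computing ρ from the covariance and standard deviations, continuing the same hypothetical joint table (all numbers assumed):

```python
import math

# Hypothetical joint PMF for two binary variables
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())
var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

# rho = Cov(X,Y) / (SD(X) * SD(Y))
rho = cov / (math.sqrt(var_x) * math.sqrt(var_y))
print(rho)  # ≈ -0.089
```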

34
Q

What do different values of ρ indicate

A
  • ρ = 0 no linear relationship between X and Y
  • ρ > 0 positive linear relationship between X and Y
  • ρ < 0 negative linear relationship between X and Y
35
Q

If W = aX + bY, what are E(W) and Var(W) for portfolio analysis

A
  • E(W) = E(aX + bY) = a * mu of x + b * mu of y
  • Var(W) = Var(aX + bY) = a^2 * Var(X) + b^2 * Var(Y) + 2 * a * b * Cov(X,Y)
  • Equivalently, Var(W) = a^2 * Var(X) + b^2 * Var(Y) + 2 * a * b * (S.D. of X) * (S.D. of Y) * Corr(X,Y)
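The portfolio formulas can be sketched with assumed example numbers; all weights, means, variances and the covariance below are hypothetical:

```python
import math

# Hypothetical portfolio W = aX + bY
a, b = 0.5, 0.5              # portfolio weights on X and Y (assumed)
mu_x, mu_y = 0.08, 0.05      # expected returns (assumed)
var_x, var_y = 0.04, 0.01    # return variances (assumed)
cov_xy = -0.005              # covariance between X and Y (assumed)

e_w = a * mu_x + b * mu_y
var_w = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy
sd_w = math.sqrt(var_w)
print(e_w, var_w, sd_w)  # ≈ 0.065, 0.01, 0.1
```

Note how a negative covariance reduces Var(W) below the weighted sum of the individual variances, which is the diversification effect.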