Stats Review Flashcards

1
Q

Random Variable

A

A variable whose value we do not know with certainty; its value is determined by the outcome of a chance experiment.

2
Q

Probability

A

A function defined over the possible values a random variable can take on.

Each probability lies between 0 and 1.

The probabilities over all possible values add up to one.

3
Q

Discrete Random Variable

A

A random variable that takes on only a finite or countably infinite number of values.

4
Q

probability density function (pdf) Discrete

A

Summarizes the information concerning the possible outcomes of the random variable and the corresponding probabilities:

f(xj) = pj, j = 1, …, k.
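A minimal numerical sketch in Python (the fair-die pdf below is an assumed example, not from the cards):

```python
# Discrete pdf of a fair six-sided die: f(x_j) = p_j for each possible value x_j.
pdf = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}

# Each probability lies in [0, 1] and the probabilities sum to one.
assert all(0 <= p <= 1 for p in pdf.values())
assert abs(sum(pdf.values()) - 1.0) < 1e-12

print(pdf[3])  # P(X = 3) = 1/6
```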

5
Q

continuous random variable

A

A random variable that takes on so many possible values that we cannot count them or match them
up with the positive integers; it takes on any particular real value with probability zero.

6
Q

PDF Continuous

A

P(a ≤ X ≤ b) = ∫_a^b fX(x)dx

where fX(x) denotes the pdf of the continuous random variable X.
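A rough numerical illustration, assuming SciPy is available and using the standard normal density as a stand-in for fX:

```python
from scipy.integrate import quad
from scipy.stats import norm

a, b = -1.0, 1.0
# P(a <= X <= b) is the area under the pdf between a and b.
prob, _ = quad(norm.pdf, a, b)
print(prob)  # roughly 0.6827 for the standard normal
```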

7
Q

cumulative distribution function

A

F(x) ≡ P(X ≤ x)

  • For discrete variables, the cdf is obtained by summing the pdf over all values xj such that
    xj ≤ x. For a continuous random variable, the cdf is the area under the pdf, fX, to the left of
    the point x, that is F(x) = ∫_−∞^x fX(t)dt.
  • Because F(x) is a probability, it is always between 0 and 1.
  • For x1 ≤ x2, F(x1) ≤ F(x2); that is, F(x) is a non-decreasing function of x.
  • For any number c, P(X > c) = 1 − F(c).
  • For any numbers a < b, P(a < X ≤ b) = F(b) − F(a). (A quick numerical check of the last two properties follows below.)
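A quick numerical check of the last two bullets, assuming SciPy and again using the standard normal cdf:

```python
from scipy.stats import norm

F = norm.cdf          # cdf of the standard normal
c, a, b = 0.5, -1.0, 1.0

print(1 - F(c))       # P(X > c) = 1 - F(c)
print(F(b) - F(a))    # P(a < X <= b) = F(b) - F(a)
```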
8
Q

Expected Value

A

A measure of central tendency of the probability distribution of X: a weighted average
of all possible values of X, where the weights are determined by the pdf.

The expected value is also called the population mean and is often denoted by μX or μ.

For any constant c, E(cX) = cE(X).
For any constants a and b, E(aX + b) = aE(X) + b.

If {a1, a2, …, an} are constants and {X1, X2, …, Xn} are random variables, then
E(a1X1 + a2X2 + … + anXn) = a1E(X1) + a2E(X2) + … + anE(Xn).

9
Q

E(X) Finite

A

E(X) = x1 f(x1) + x2 f(x2) + … + xk f(xk) ≡ ∑ xj f(xj)
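A plain-Python sketch, reusing the assumed fair-die pdf:

```python
# E(X) = sum over j of x_j * f(x_j).
pdf = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}
expected_value = sum(x * p for x, p in pdf.items())
print(expected_value)  # 3.5
```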

10
Q

E(X) continuous

A

If X is a continuous random variable and fX (x) is its pdf, then:
E(X) = ∫ x fX(x)dx

11
Q

Variance

A

Measures how far X is from its mean μ, on average (in terms of squared distance):

Var(X) ≡ E[(X − μ)^2]

Denoted by σ^2.
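A sketch continuing the assumed fair-die example:

```python
pdf = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}
mu = sum(x * p for x, p in pdf.items())

# Var(X) = E[(X - mu)^2]: a probability-weighted average of squared deviations.
var = sum((x - mu) ** 2 * p for x, p in pdf.items())
print(var)  # 35/12, about 2.917
```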

12
Q

Standard Deviation

A

σ = √Var(X), the positive square root of the variance.

13
Q

standardized random variable

A

Z ≡ (X − μ)/σ
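A simulation sketch assuming NumPy; after standardizing, the draws have mean roughly 0 and standard deviation roughly 1 (the values μ = 5 and σ = 2 are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # X with mu = 5, sigma = 2

z = (x - 5.0) / 2.0       # Z = (X - mu) / sigma
print(z.mean(), z.std())  # approximately 0 and 1
```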

14
Q

Joint Distribution

A

F(x, y) ≡ P(X ≤ x, Y ≤ y)

15
Q

X and Y are said to be independent iff

A

fX,Y(x, y) = fX(x) fY(y) for all x and y

16
Q

Conditional Distribution

A

While the joint distribution tells us the probability that certain values of two (or more) random variables are observed concurrently, the conditional distribution describes how Y is distributed given that X takes a particular value x:

fY|X(y|x) = fX,Y(x, y) / fX(x)

17
Q

Covariance

A

The covariance measures the extent of the linear association between two random variables. If the
covariance is positive (negative), the two variables tend to move in the same (opposite) direction.

Cov(X, Y) ≡ E[(X − μX)(Y − μY)]

Denoted by σXY.

If X and Y are independent, then Cov(X, Y) = 0; however, Cov(X, Y) = 0 does not imply that X and Y are independent.
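A simulation sketch assuming NumPy; with many draws the sample covariance approximates Cov(X, Y), and the second check illustrates zero covariance without independence:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200_000)
y = 2.0 * x + rng.normal(size=200_000)  # Y moves with X, so Cov(X, Y) > 0

print(np.cov(x, y)[0, 1])       # roughly 2.0

# Y = X^2 is completely determined by X, yet Cov(X, X^2) is ~0 here.
print(np.cov(x, x ** 2)[0, 1])  # close to 0
```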

18
Q

Correlation Coefficient

A

Corr(X, Y) = σXY / (σX σY)

It is unit-free and always lies between −1 and 1.
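A quick check with NumPy (simulated data, so the values are approximate):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = 2.0 * x + rng.normal(size=200_000)

corr_manual = np.cov(x, y)[0, 1] / (x.std(ddof=1) * y.std(ddof=1))
print(corr_manual, np.corrcoef(x, y)[0, 1])  # both roughly 0.894
```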

19
Q

Sums of random variables

A

Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
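A simulation check of the identity, assuming NumPy (a, b and the data-generating process are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=500_000)
y = 0.5 * x + rng.normal(size=500_000)  # correlated with x
a, b = 2.0, 3.0

lhs = np.var(a * x + b * y)
rhs = a**2 * np.var(x) + b**2 * np.var(y) + 2 * a * b * np.cov(x, y)[0, 1]
print(lhs, rhs)  # approximately equal
```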

20
Q

Conditional Expectations

A

E(Y|x) = ∑ yj fY|X(yj|x) (Discrete)

E(Y|x) = ∫ y fY|X(y|x)dy (Continuous)

21
Q

law of iterated expectations

A

E[E(Y|X)] = E(Y)
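A simulation sketch assuming NumPy: averaging the conditional mean E(Y|X) over draws of X recovers E(Y) (the data-generating process is made up):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
x = rng.binomial(1, 0.3, size=n)        # X = 1 with probability 0.3
y = 2.0 + 3.0 * x + rng.normal(size=n)  # so E(Y|X) = 2 + 3X

e_y_given_x = 2.0 + 3.0 * x             # plug in the known conditional mean
print(e_y_given_x.mean(), y.mean())     # both approximately E(Y) = 2.9
```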

22
Q

Estimator

A

A function of the sample outcomes that generates an estimate of the unknown parameter θ.

23
Q

Ybar Estimate

A

Ybar = (1/n) ∑ Yi

24
Q

S^2

A

S^2 = 1/(n − 1) ∑(Yi − Ybar)^2
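A check that this matches NumPy's sample variance with ddof=1 (the data values are made up):

```python
import numpy as np

y = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
ybar = y.mean()

s2_manual = ((y - ybar) ** 2).sum() / (len(y) - 1)
print(s2_manual, np.var(y, ddof=1))  # identical values
```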

25
Q

SXY

A

SXY = 1/(n − 1) ∑(Xi − Xbar)(Yi − Ybar)

26
Q

T-Test

A

T = √n (Ybar − μ0)/S, the t statistic for testing H0: μ = μ0.
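A sketch comparing the hand-computed statistic with scipy.stats.ttest_1samp (the data and μ0 = 4 are made up):

```python
import numpy as np
from scipy.stats import ttest_1samp

y = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
mu0 = 4.0

t_manual = np.sqrt(len(y)) * (y.mean() - mu0) / y.std(ddof=1)
result = ttest_1samp(y, mu0)
print(t_manual, result.statistic)  # the two t statistics agree
```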