Exam 2 (pt. 1) Flashcards

Chapter 4 Content

1
Q

A random variable X assigns a numerical value X(s) to . . .

A

each possible outcome s of the experiment

2
Q

The source of the randomness in a random variable is _______________, in which a sample outcome s ∈ S is chosen
according to a ___________

A

the experiment itself
probability function P

3
Q

A random variable X is said to be discrete if there is a finite list of values a1, a2, . . . , an or an infinite list of values a1, a2, … such that . . .

A

P(X = aj) ∈ [0, 1]
∑j P(X = aj) = 1

4
Q

If X is a discrete r.v., then the finite or countably infinite set of values x such that P(X = x) > 0 is called the ______ of X.

A

support

5
Q

The _____________ specifies the probabilities of all events associated with the r.v.

A

distribution of a random variable

6
Q

The probability mass function (PMF) of a discrete r.v. X is the function pX given by _________. Note that this is ________
if x is in the support of X, and _________ otherwise.

A

pX (x) = P(X = x)

positive, zero

7
Q

The cumulative distribution function (CDF) of an r.v. X is the function F given by __________

A

F(x) = P(X ≤ x)

8
Q

The expected value of a discrete r.v. X whose distinct possible values are x1, x2, . . ., is defined by . . .

A

E(X) = ∑j xj P(X = xj)

9
Q

The _____________ of X is a weighted average of the possible values that X can take on, weighted by their probabilities

A

expected value

10
Q

For any r.v.s X, Y and any constant c there are 4 primary manipulations to know for the E(X):
1. E(c) =
2. E(X + Y) =
3. E(X + c) =
4. E(cX) =

A

1. E(c) = c
2. E(X + Y) = E(X) + E(Y)
3. E(X + c) = E(X) + c
4. E(cX) = cE(X)

11
Q

The variance of an r.v. X is ________

A

Var(X) = E[(X − E(X))^2] = E(X^2) − (E(X))^2

12
Q

For any r.v.s X, Y and any constant c there are 4 primary manipulations to know for the Var(X):
1. Var(X + c) =
2. Var(cX) =
3. Var(X + Y) =
4. Var(X) >= 0 when . . .

A

1. Var(X + c) = Var(X)
2. Var(cX) = c^2 Var(X)
3. Var(X + Y) = Var(X) + Var(Y) if X and Y are independent
4. Var(X) ≥ 0 always, with Var(X) = 0 if and only if X is a constant

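
The expectation and variance rules in the last few cards can be verified numerically on a small hand-made PMF. A minimal sketch; the support values, probabilities, and constant c are arbitrary, chosen only for illustration:

```python
import math

# Toy PMF for X: arbitrary support values with probabilities summing to 1
pmf = {0: 0.2, 1: 0.5, 3: 0.3}

def E(pmf):
    """Expected value: sum of value * probability over the support."""
    return sum(x * pr for x, pr in pmf.items())

def Var(pmf):
    """Variance: expected squared deviation from the mean."""
    mu = E(pmf)
    return sum((x - mu) ** 2 * pr for x, pr in pmf.items())

c = 4.0
shifted = {x + c: pr for x, pr in pmf.items()}  # PMF of X + c
scaled = {c * x: pr for x, pr in pmf.items()}   # PMF of cX

assert math.isclose(E({c: 1.0}), c)                # E(c) = c
assert math.isclose(E(shifted), E(pmf) + c)        # E(X + c) = E(X) + c
assert math.isclose(E(scaled), c * E(pmf))         # E(cX) = c E(X)
assert math.isclose(Var(shifted), Var(pmf))        # Var(X + c) = Var(X)
assert math.isclose(Var(scaled), c**2 * Var(pmf))  # Var(cX) = c^2 Var(X)
```

Note the asymmetry the code mirrors: E(X + Y) = E(X) + E(Y) holds for any r.v.s, but Var(X + Y) = Var(X) + Var(Y) additionally requires independence.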
13
Q

An r.v. X is said to have the Bernoulli distribution with
parameter p if . . .

A

P(X = 1) = p and P(X = 0) = 1 − p
where 0 < p < 1

14
Q

A Bernoulli distribution is notated as . . .

A

X ∼ Bern(p)
where p = P(X = 1)

15
Q

An experiment that can result in either a “success” or a “failure” (but not both) is called a ___________

A

Bernoulli trial

16
Q

What is the PMF of a Bernoulli distribution?

A

pX(1) = p, pX(0) = 1 − p
(equivalently, pX(x) = p^x (1 − p)^(1−x) for x ∈ {0, 1})

17
Q

What is the CDF of a Bernoulli distribution?

A

F(x) = 0 for x < 0
F(x) = 1 − p for 0 ≤ x < 1
F(x) = 1 for x ≥ 1

18
Q

Any r.v. whose possible values are 0 and 1 has a ___________, with p the probability of the r.v. equaling 1. This number p is called the ___________ of the distribution

A

Bern(p) distribution
parameter

19
Q

Let X ∼ Bern(p). Then E(X) = ________

A

E(X) = 1 · p + 0 · (1 − p) = p

20
Q

Let X ∼ Bern(p). Then Var(X) = ________

A

Var(X) = p(1 − p)
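
Both Bernoulli moments follow by plugging the Bern(p) PMF into the definitions of E and Var. A quick numeric sketch (p = 0.3 is an arbitrary choice):

```python
p = 0.3                  # arbitrary success probability
pmf = {1: p, 0: 1 - p}   # Bern(p) PMF: P(X=1) = p, P(X=0) = 1 - p

mean = sum(x * pr for x, pr in pmf.items())               # E(X)
var = sum((x - mean) ** 2 * pr for x, pr in pmf.items())  # Var(X)

assert abs(mean - p) < 1e-12            # E(X) = p
assert abs(var - p * (1 - p)) < 1e-12   # Var(X) = p(1 - p)
```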

21
Q

Suppose that n independent Bernoulli trials are performed, each with the same success probability p. Let X be the number of successes. The distribution of X is called the ________ distribution

A

Binomial

22
Q

A Binomial distribution is notated as . . .

A

X ∼ Bin(n, p)
parameters n and p, where n is the number of trials and p is the success probability

23
Q

What is the PMF of a Binomial distribution?

A

P(X = k) = C(n, k) p^k (1 − p)^(n−k), for k = 0, 1, . . . , n
where C(n, k) = n!/(k!(n − k)!) is the binomial coefficient

24
Q

Let X ∼ Bin(n,p). Then E(X) = ________

A

E(X) = np

25
Q

Let X ∼ Bin(n,p). Then Var(X) = ________

A

Var(X) = np(1 − p)
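
All three Binomial facts (the PMF summing to 1, E(X) = np, and Var(X) = np(1 − p)) can be checked by building the PMF with math.comb; n = 10 and p = 0.4 are arbitrary example parameters:

```python
from math import comb, isclose

n, p = 10, 0.4  # arbitrary example parameters

# Bin(n, p) PMF: P(X = k) = C(n, k) p^k (1 - p)^(n - k)
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mean = sum(k * pr for k, pr in pmf.items())
var = sum((k - mean) ** 2 * pr for k, pr in pmf.items())

assert isclose(sum(pmf.values()), 1.0)   # PMF sums to 1
assert isclose(mean, n * p)              # E(X) = np
assert isclose(var, n * p * (1 - p))     # Var(X) = np(1 - p)
```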

26
Q

If X ∼ Bern(p), then by definition, X ∼ Bin(____)

A

(1, p)

27
Q

Let X ∼ Bin(n, p). How do we denote the
failure probability of a Bernoulli trial? Then the distribution becomes _____

A

q = 1 − p
The number of failures is then n − X, and n − X ∼ Bin(n, q)

28
Q

Let X ∼ Bin(4, 1/2). How do you find the CDF value F(1.5) = P(X ≤ 1.5)?

A

Sum the PMF over all values in the support that are less than or equal to 1.5:
F(1.5) = P(X = 0) + P(X = 1) = 1/16 + 4/16 = 5/16

29
Q

Let X ∼ Bin(4, 1/2). What is P(X > 1)?

A

P(X > 1) = 1 − P(X ≤ 1) = 1 − P(X = 0) − P(X = 1) = 1 − 1/16 − 4/16 = 11/16

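
Both Bin(4, 1/2) questions above reduce to summing terms of the PMF. A short sketch using only the standard library:

```python
from math import comb

n, p = 4, 0.5
# Bin(4, 1/2) PMF: P(X = k) = C(4, k) / 16
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

# CDF at 1.5: sum the PMF over support values <= 1.5 (i.e. k = 0 and k = 1)
F_1_5 = sum(pr for k, pr in pmf.items() if k <= 1.5)
assert abs(F_1_5 - 5 / 16) < 1e-12

# P(X > 1) by the complement rule
tail = 1 - F_1_5
assert abs(tail - 11 / 16) < 1e-12
```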
30
Q

Consider a sequence of independent Bernoulli trials, each with
the same success probability p ∈ (0, 1), with trials performed until a success occurs. Let X be the number of failures before the first successful trial. Then X has the_____________

A

Geometric distribution

31
Q

A Geometric distribution is notated as . . .

A

X ∼ Geom(p)
with p as the success probability

32
Q

What is the PMF of a Geometric distribution starting at 0?

A

P(X = k) = (1 − p)^k p, for k = 0, 1, 2, . . .

33
Q

Let X ∼ Geom(p). Then E(X) = ________

A

E(X) = (1 - p)/p

34
Q

Let X ∼ Geom(p). Then Var(X) = ________

A

Var(X) = (1 - p)/p^2
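
Both Geometric moments can be checked by summing the PMF P(X = k) = (1 − p)^k p over enough terms; the geometric series converges fast, so a finite truncation is accurate (p = 0.3 is an arbitrary choice):

```python
p = 0.3   # arbitrary success probability
N = 2000  # truncation point; the neglected tail is negligible

# Geom(p) PMF starting at 0: P(X = k) = (1 - p)^k * p
pmf = [(1 - p) ** k * p for k in range(N)]

mean = sum(k * pr for k, pr in enumerate(pmf))
var = sum((k - mean) ** 2 * pr for k, pr in enumerate(pmf))

assert abs(sum(pmf) - 1.0) < 1e-12       # PMF sums to 1
assert abs(mean - (1 - p) / p) < 1e-9    # E(X) = (1 - p)/p
assert abs(var - (1 - p) / p**2) < 1e-9  # Var(X) = (1 - p)/p^2
```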

35
Q

In a sequence of independent Bernoulli trials with success probability p, if X is the number of failures before the rth
success, then X is said to have the ______

A

Negative Binomial distribution

36
Q

A Negative binomial distribution is notated as . . .

A

X ∼ NBin(r, p)
where r is the desired number of successes and p is the probability of success

37
Q

What is the PMF of a Negative binomial distribution?

A

P(X = k) = C(k + r − 1, r − 1) p^r (1 − p)^k, for k = 0, 1, 2, . . .

38
Q

Let X ∼ NBin(r,p). Then E(X) = ________

A

E(X) = r(1 − p)/p

39
Q

Let X ∼ NBin(r,p). Then Var(X) = ________

A

Var(X) = r(1 - p)/p^2
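
The NBin(r, p) moments can be checked the same way, by a truncated sum over the PMF; note the mean comes out to r times the Geom(p) mean (r = 3 and p = 0.4 are arbitrary example values):

```python
from math import comb

r, p = 3, 0.4  # arbitrary example parameters
N = 2000       # truncation point; the neglected tail is negligible

# NBin(r, p) PMF: P(X = k) = C(k + r - 1, r - 1) p^r (1 - p)^k
pmf = [comb(k + r - 1, r - 1) * p**r * (1 - p) ** k for k in range(N)]

mean = sum(k * pr for k, pr in enumerate(pmf))
var = sum((k - mean) ** 2 * pr for k, pr in enumerate(pmf))

assert abs(sum(pmf) - 1.0) < 1e-9
assert abs(mean - r * (1 - p) / p) < 1e-6    # E(X) = r(1 - p)/p
assert abs(var - r * (1 - p) / p**2) < 1e-6  # Var(X) = r(1 - p)/p^2
```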

40
Q

An r.v. X has the __________ with parameter λ if it describes the distribution of . . .
1. the number of successes in a particular region or _________
2. the number of successes in a large number of trials, each with a ___________.

A

Poisson distribution
interval of time
small probability of success

41
Q

What is the PMF of a Poisson distribution?

A

P(X = k) = e^(−λ) λ^k / k!, for k = 0, 1, 2, . . .

42
Q

A Poisson distribution is notated as . . .

A

X ~ Pois(λ)
Where parameter λ is interpreted as the rate of occurrence of these rare events

43
Q

Let X ~ Pois(λ). Then E(X) = _________

A

E(X) = λ

44
Q

Let X ~ Pois(λ). Then Var(X) = _________

A

Var(X) = λ

45
Q

The Poisson paradigm is also called the law of rare events. The interpretation of “rare” is that the _____ are small, not that _______ is small.

A

p (the success probabilities)
λ (the rate)

46
Q

Given X ~ Bin(n,p) where n is large, p is small, and np is moderate, we can approximate the Bin(n, p) PMF by the
_____________

A

Pois(np) PMF
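
A numeric illustration of the approximation: for large n, small p, and moderate np, the Bin(n, p) and Pois(np) PMFs nearly coincide (n = 1000 and p = 0.002 are arbitrary example values, giving λ = np = 2):

```python
from math import comb, exp, factorial

n, p = 1000, 0.002  # large n, small p
lam = n * p         # np = 2 is moderate

def bin_pmf(k):
    # Bin(n, p) PMF: C(n, k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def pois_pmf(k):
    # Pois(lam) PMF: e^(-lam) lam^k / k!
    return exp(-lam) * lam**k / factorial(k)

# Compare the two PMFs on the region holding essentially all the mass
max_diff = max(abs(bin_pmf(k) - pois_pmf(k)) for k in range(21))
assert max_diff < 1e-3  # Pois(np) tracks Bin(n, p) closely here
```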

47
Q

Let C be a finite, nonempty set of numbers. Choose one of these numbers uniformly at random (i.e., all values in C are equally likely). Call the chosen number X. Then X is said to
have the _________________

A

Discrete Uniform distribution

48
Q

A Discrete Uniform Distribution is notated as . . .

A

X ~ Unif(C)
Where parameter C is the finite, nonempty set of numbers

49
Q

What is the PMF of a Discrete Uniform distribution?

A

pX(x) = 1/|C| for x ∈ C, and 0 otherwise

50
Q

Let X ~ Unif(C). Then E(X) = __________

A

(a + b)/2
where a is the minimum of the set C and b is the maximum (this closed form holds when C = {a, a + 1, . . . , b} is a set of consecutive integers)

51
Q

Let X ~ Unif(C). Then Var(X) = __________

A

Var(X) = ((b − a + 1)^2 − 1)/12
where a is the minimum of the set C and b is the maximum (this closed form holds when C = {a, a + 1, . . . , b} is a set of consecutive integers)
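
When C = {a, a + 1, . . . , b} is a run of consecutive integers, the closed forms E(X) = (a + b)/2 and Var(X) = ((b − a + 1)^2 − 1)/12 can be checked against a direct computation from the PMF; a fair die is the classic example:

```python
C = range(1, 7)                     # a fair die: C = {1, 2, 3, 4, 5, 6}
probs = {x: 1 / len(C) for x in C}  # Discrete Uniform PMF: 1/|C| each

mean = sum(x * pr for x, pr in probs.items())
var = sum((x - mean) ** 2 * pr for x, pr in probs.items())

a, b = min(C), max(C)
assert abs(mean - (a + b) / 2) < 1e-12                # E(X) = 3.5
assert abs(var - ((b - a + 1) ** 2 - 1) / 12) < 1e-9  # Var(X) = 35/12
```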

52
Q

For an experiment with sample space S, an r.v. X, and a function g : R → R, g(X) is the r.v. that maps s to _____________

A

g(X(s)) for all s ∈ S

53
Q

Given a discrete r.v. X with a known PMF, how can we find the PMF of Y = g(X)?

A

The support of Y is {g(x) : x in the support of X}, and for each such y,
P(Y = y) = ∑ P(X = x), where the sum runs over all x in the support of X with g(x) = y

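
To find the PMF of Y = g(X), group the support values of X by g(x) and add their probabilities. A minimal sketch with g(x) = x^2 and an arbitrary toy PMF for X:

```python
# Toy PMF for X, symmetric so that g(x) = x^2 merges two support values
pmf_X = {-1: 0.25, 0: 0.5, 1: 0.25}

def g(x):
    return x * x

# P(Y = y) = sum of P(X = x) over all x in the support with g(x) = y
pmf_Y = {}
for x, pr in pmf_X.items():
    y = g(x)
    pmf_Y[y] = pmf_Y.get(y, 0.0) + pr

assert pmf_Y == {1: 0.5, 0: 0.5}  # x = -1 and x = 1 collapse onto y = 1
```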
54
Q

Random variables X and Y are said to be independent if . . .

A

P(X ≤ x, Y ≤ y) = P(X ≤ x)P(Y ≤ y)

or for discrete cases
P(X = x, Y = y) = P(X = x)P(Y = y)

55
Q

If X ∼ Bin(n, p), Y ∼ Bin(m, p), and X is independent of Y, then . . .

A

X + Y ∼ Bin(n + m, p)
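
This closure property can be verified by convolving the two PMFs: for independent X and Y, P(X + Y = k) = ∑j P(X = j)P(Y = k − j), which should match the Bin(n + m, p) PMF term by term (n = 3, m = 5, p = 0.4 are arbitrary example values):

```python
from math import comb

def bin_pmf(n, p, k):
    # Bin(n, p) PMF: C(n, k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, m, p = 3, 5, 0.4  # arbitrary example parameters

# Convolution: P(X + Y = k) = sum_j P(X = j) P(Y = k - j), X and Y independent
conv = [
    sum(bin_pmf(n, p, j) * bin_pmf(m, p, k - j)
        for j in range(max(0, k - m), min(n, k) + 1))
    for k in range(n + m + 1)
]

# Matches the Bin(n + m, p) PMF term by term (Vandermonde's identity)
assert all(abs(conv[k] - bin_pmf(n + m, p, k)) < 1e-12 for k in range(n + m + 1))
```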

56
Q

Random variables X and Y are said to be conditionally independent given an r.v. Z if, for all x, y ∈ R and all z in the support of Z, . . .

A

P(X ≤ x, Y ≤ y | Z = z) = P(X ≤ x | Z = z)P(Y ≤ y | Z = z)

or for discrete cases
P(X = x, Y = y | Z = z) = P(X = x | Z = z)P(Y = y | Z = z)

57
Q

For any discrete r.v.s X and Z, the function P(X = x|Z = z),
when considered as a function of x for fixed z, is called the
_________________ of X given Z = z.

A

conditional PMF