Chapter 5: Joint Probability Distributions Flashcards

1
Q

Joint Probability Mass Function for Discrete Variables

A
2
Q

Joint Probability Mass Function for Continuous Variables

A
3
Q

Joint Probability Density Function for Continuous Variables

A
4
Q

Example of Joint Probability Density

A
5
Q

Marginal Densities in example

A
6
Q

Independent Random Variables

A
7
Q

Independent Random Variables Explained

A
  • The definition says that two variables are independent if their joint pmf or pdf is the product of the two marginal pmf’s or pdf’s.
  • Intuitively, independence says that knowing the value of one of the variables does not provide additional information about what the value of the other variable might be.
  • Independence of X and Y requires that every entry in the joint probability table be the product of the corresponding row and column marginal probabilities.
  • Independence of two random variables is most useful when the description of the experiment under study suggests that X and Y have no effect on one another.
    • Then once the marginal pmf’s or pdf’s have been specified, the joint pmf or pdf is simply the product of the two marginal functions. It follows that
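A minimal sketch of this check, using a made-up 2×2 joint probability table (the values are illustrative, chosen so the product condition holds):

```python
# Hypothetical joint pmf of X and Y; any joint table can be checked this way.
joint = {
    (0, 0): 0.12, (0, 1): 0.18,
    (1, 0): 0.28, (1, 1): 0.42,
}

# Marginal pmf's: sum the joint pmf over the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# X and Y are independent iff every entry in the joint table equals the
# product of the corresponding row and column marginal probabilities.
independent = all(
    abs(joint[(x, y)] - px[x] * py[y]) < 1e-12
    for (x, y) in joint
)
```

Changing any single entry (while keeping the table summing to 1) would break the product condition and make `independent` false.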
8
Q

Conditional Distributions

A
9
Q

Expected Value

A
10
Q

Covariance

A
  • Thus, the covariance of X and Y provides a measure of the degree to which the two variables tend to “move together”:
    • a positive covariance indicates that the deviations of X and Y from their respective means tend to have the same sign;
    • a negative covariance indicates that the deviations of X and Y from their respective means tend to have opposite signs.
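Given a joint pmf, the covariance can be computed directly from the shortcut formula Cov(X, Y) = E[XY] − E[X]·E[Y]; the joint table below is made up for illustration:

```python
# Hypothetical joint pmf: mass concentrated on (0,0) and (1,1), so X and Y
# tend to move together and the covariance should come out positive.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex  = sum(x * p for (x, y), p in joint.items())      # E[X]
ey  = sum(y * p for (x, y), p in joint.items())      # E[Y]
exy = sum(x * y * p for (x, y), p in joint.items())  # E[XY]

cov = exy - ex * ey  # shortcut formula: Cov(X, Y) = E[XY] - E[X]E[Y]
```

Here E[X] = E[Y] = 0.5 and E[XY] = 0.4, so the covariance is 0.15 > 0, consistent with the "move together" reading above.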
11
Q

Correlation

A
12
Q

Correlation Coefficient

A
13
Q

Properties of the Correlation Coefficient

A
14
Q

Joint PMF of Two Discrete Random Variables

A
15
Q

Marginal PMF of Two Discrete RVs

A
16
Q

Example: Joint Probability Distribution of 2 Discrete RVs

A
17
Q

Example: Joint Probability Distribution of 2 Discrete RVs (contd.)

A
18
Q

Finding Marginal Distributions from Previous Example

A
19
Q

Two Continuous Random Variables

A
20
Q

If A is the two-dimensional rectangle

A
21
Q

If A is the two-dimensional rectangle

A
22
Q

Example 3

A
23
Q

Example 3 contd.

A
24
Q
A
25
Marginal PDF of Two Continuous RVs
26
Joint Probability Distributions - Expected Values
27
28
Conditional Distributions
29
Conditional PDF and PMF
30
Example 12
31
Example 12: Conditional Probability Function
32
* The probability that the walk-up facility is busy at most half the time given that X = .8 is then
33
* The expected proportion of time that the walk-up facility is busy given that X = .8 (a conditional expectation) is
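These cards do not reproduce Example 12's joint pdf, so as an illustration the sketch below assumes a hypothetical density f(x, y) = (6/5)(x + y²) on [0, 1] × [0, 1]; both conditional quantities can then be approximated by numerical integration:

```python
# Assumed (hypothetical) joint pdf for the two busy-proportions X and Y.
def f(x, y):
    return 1.2 * (x + y * y)

def integrate_y(g, a, b, n=10_000):
    # Midpoint-rule numerical integration of g over y in [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

x0 = 0.8
fx = integrate_y(lambda y: f(x0, y), 0.0, 1.0)           # marginal f_X(.8)

# P(Y <= .5 | X = .8): integrate the conditional pdf f(.8, y) / f_X(.8).
p_half = integrate_y(lambda y: f(x0, y), 0.0, 0.5) / fx

# E[Y | X = .8]: integrate y * f(.8, y) / f_X(.8) over [0, 1].
e_cond = integrate_y(lambda y: y * f(x0, y), 0.0, 1.0) / fx
```

Under this assumed density the two answers come out near .390 and .574; with the textbook's actual pdf only the `f(x, y)` line would change.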
34
Statistics and Their Distributions
35
Statistics and Their Distributions (contd.)
36
Example 19
37
Example 19 contd.
38
Example 19 - Table 5.1
* Notice first that the 10 observations in any particular sample are all different from those in any other sample.
* Second, the 6 values of the *sample mean* are all different from one another, as are the 6 values of the *sample median* and the 6 values of the *sample standard deviation*.
* The same is true of the *sample 10% trimmed means*, *sample fourth spreads*, and so on.
* Furthermore, the value of the sample mean from any particular sample can be regarded as a **_point estimate_** (“point” because it is a single number, corresponding to a single point on the number line) of the **_population mean μ_**, whose value is known to be 4.4311.
* None of the estimates from these six samples is identical to what is being estimated.
* The estimates from the second and sixth samples are much too large, whereas the fifth sample gives a substantial underestimate.
* Similarly, the **_sample standard deviation_** gives a point estimate of the **_population standard deviation_**.
* All 6 of the resulting estimates are in error by at least a small amount.
39
Example 19 Summary
In summary,
* the values of the individual sample observations vary from sample to sample, so the value of any quantity computed from sample data will, in general, vary as well; and
* the value of a sample characteristic used as an estimate of the corresponding population characteristic will virtually never coincide with what is being estimated.
40
Statistics and Their Distributions
* A **_statistic_** is any quantity whose value can be calculated from sample data.
* Prior to obtaining data, there is uncertainty as to what value of any particular statistic will result.
* Therefore, a _statistic is a random variable_ and will be denoted by an uppercase letter; a lowercase letter is used to represent the calculated or observed value of the statistic.
41
Statistics and Their Distributions (contd.)
42
Random Samples: The **Probability Distribution** of any particular statistics depends on:
1. the population distribution
2. the sample size *n*
3. **_the method of sampling_**
43
**_Simple Random Sample_**
* Advantage of the *simple random sampling* method:
  * the probability distribution of any statistic can be more easily obtained than for any other sampling method.
44
Deriving the Sampling distribution of a statistic:
Two general methods for obtaining information about a statistic’s sampling distribution:
1. using calculations based on probability rules;
2. carrying out a simulation experiment.
45
**Deriving a Sampling Distribution:** Probability Rules
Probability rules can be used to obtain the distribution of a statistic provided that it is a “fairly simple” function of the Xi’s and either:
* there are relatively few different X values in the population; **_or_**
* the population distribution has a “nice” form.

Typical procedure (see Example 20):
(i) construct all possible samples of the desired sample size,
(ii) calculate the value of the statistic of interest for each sample and its associated probability,
(iii) construct the pmf of the statistic of interest.
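The three-step procedure can be sketched for a tiny made-up population: X takes the values 1, 2, 3 with pmf .5, .3, .2, and samples of size n = 2 are drawn with replacement (all of these choices are illustrative):

```python
from itertools import product

# Hypothetical population pmf.
pmf = {1: 0.5, 2: 0.3, 3: 0.2}
n = 2

stat_pmf = {}
# (i) construct all possible samples of size n,
# (ii) compute the statistic (sample mean) and the sample's probability,
# (iii) accumulate into the pmf of the statistic.
for sample in product(pmf, repeat=n):
    xbar = sum(sample) / n
    prob = 1.0
    for x in sample:
        prob *= pmf[x]
    stat_pmf[xbar] = stat_pmf.get(xbar, 0.0) + prob
```

The resulting pmf sums to 1, and the mean of the sampling distribution of X̄ equals the population mean (here 1.7), as the theory predicts.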
46
**_Deriving a Sampling Distribution:_** Simulation Experiments
* This method is usually used when a derivation via probability rules is too difficult or complicated to be carried out.
* Such an experiment is virtually always done with the aid of a computer.
47
**_Deriving a Sampling Distribution_**: Simulation Experiments
48
**_Simulation Experiment:_** Density curve of normal rv X with mean = 8.25 and s.d. = 0.75
49
Simulation Experiment Histograms
Notable features of the histograms showing the sampling distribution of the mean:
* **_Shape_** → all reasonably normal.
* **_Center_** → each histogram is approximately centered at μ = 8.25.
* **_Spread_** → the larger the value of n, the more concentrated the sampling distribution is about the mean value. This is why the histograms for n = 20 and n = 30 are based on narrower class intervals than those for the two smaller sample sizes.
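A minimal version of this simulation experiment (the population parameters match the slide; the sample size n and replication count k below are our own illustrative choices):

```python
import random
import statistics

random.seed(0)            # fixed seed so the run is reproducible
mu, sigma = 8.25, 0.75    # normal population from the slide
n, k = 20, 500            # sample size and number of replications (our choices)

# Draw k samples of size n and record each sample mean.
means = [
    statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(k)
]

# The histogram of `means` should be centered near mu, with spread near
# sigma / sqrt(n) (about 0.17 here) -- smaller than the population s.d.
center = statistics.fmean(means)
spread = statistics.stdev(means)
```

Re-running with n = 5, 10, 20, 30 reproduces the spread pattern described above: larger n, more concentrated histogram.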
50
Simulation Experiment Contd.
51
The Distribution of the Sample Mean
52
Properties of **_Sample Mean_** and **_Sample Sum_**
53
Example 25
54
Example 25 contd.
55
The Central Limit Theorem
56
Central Limit Theorem (contd.)
57
Convergence of means from U[-1,1] to a normal shape
58
Histograms of raw means of samples from U[-1,1]
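The convergence these slides illustrate can be sketched numerically: means of samples from U[−1, 1] cluster around 0 with standard deviation √(1/3)/√n (the sample size n and replication count k below are illustrative choices):

```python
import math
import random
import statistics

random.seed(1)
n, k = 30, 2000  # sample size and number of replicated samples (our choices)

# Means of k samples of size n from the uniform distribution on [-1, 1].
means = [
    statistics.fmean(random.uniform(-1.0, 1.0) for _ in range(n))
    for _ in range(k)
]

# U[-1, 1] has mean 0 and variance (b - a)^2 / 12 = 1/3, so by the CLT the
# sample means are approximately normal with s.d. sqrt(1/3) / sqrt(n).
theoretical_sd = math.sqrt(1.0 / 3.0) / math.sqrt(n)
observed_sd = statistics.stdev(means)
```

A histogram of `means` shows the bell shape the slides describe, even though the population itself is flat.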
59
Apply Central Limit Theorem
60
Example 26
61
Implications of CLT
62
Linear Combinations and their means
63
Variances of linear combinations
64
The Difference between Random Variables
65
The Case of Normal Random Variables