Class Test 1 Flashcards
General formula, E(Y), V(Y) for binomial?
P(Y=y) = (n choose y) p^y q^(n-y) E(Y) = np V(Y) = npq
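These formulas can be sanity-checked numerically; a minimal Python sketch (the parameters n = 10 and p = 0.3 are arbitrary):

```python
from math import comb

def binomial_pmf(y, n, p):
    # P(Y = y) = C(n, y) * p^y * q^(n - y), with q = 1 - p
    q = 1 - p
    return comb(n, y) * p**y * q**(n - y)

n, p = 10, 0.3
pmf = [binomial_pmf(y, n, p) for y in range(n + 1)]
mean = sum(y * pmf[y] for y in range(n + 1))
var = sum((y - mean) ** 2 * pmf[y] for y in range(n + 1))
# mean matches np = 3.0 and var matches npq = 2.1
```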
General formula, E(Y), V(Y) for geometric?
p(y) = q^(y-1) p E(Y) = 1/p V(Y) = (1-p)/p^2
What is Y in the geometric distribution?
The number of the trial on which the first success occurs
P(Y>y) for geometric distribution?
q^y
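The tail identity P(Y > y) = q^y holds because surviving past trial y means y consecutive failures; a quick check in Python (p = 0.25 and y = 5 are arbitrary):

```python
p = 0.25
q = 1 - p
y = 5
# P(Y > y) = 1 - sum of the pmf p(k) = q^(k-1) * p over k = 1..y
tail = 1 - sum(q**(k - 1) * p for k in range(1, y + 1))
# tail equals q^y, the probability of y consecutive failures
```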
General formula, E(Y), V(Y) for negative binomial?
p(y) = (y-1 choose r-1) p^r q^(y-r) E(Y) = r/p V(Y) = r(1-p)/p^2
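A numeric check of E(Y) = r/p (r = 3 and p = 0.4 are arbitrary; the pmf is truncated at y = 400, where the remaining tail mass is negligible):

```python
from math import comb

r, p = 3, 0.4
q = 1 - p
# p(y) = C(y-1, r-1) * p^r * q^(y-r) for y = r, r+1, ...
pmf = {y: comb(y - 1, r - 1) * p**r * q**(y - r) for y in range(r, 400)}
mean = sum(y * pr for y, pr in pmf.items())
# mean matches r/p = 7.5 up to the (tiny) truncation error
```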
General formula, E(Y), V(Y) for Poisson?
P(Y=y) = (e^(-λ) λ^y)/y! E(Y) = λ V(Y) = λ
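Both moments can be checked from the pmf (λ = 4 is arbitrary; the sum is truncated at y = 100, where the tail is negligible and well before factorials overflow a float):

```python
from math import exp, factorial

lam = 4.0
pmf = [exp(-lam) * lam**y / factorial(y) for y in range(100)]
mean = sum(y * p for y, p in enumerate(pmf))
var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
# both mean and var come out equal to λ = 4
```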
General formula, E(Y), V(Y) for uniform?
f(y) = 1/(θ2-θ1) for θ1 ≤ y ≤ θ2
0 elsewhere
E(Y) = (θ1+θ2)/2
V(Y) = ((θ2-θ1)^2)/12
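Both moments can be checked by midpoint-rule integration against the density (θ1 = 2 and θ2 = 10 are arbitrary):

```python
t1, t2 = 2.0, 10.0
n = 100_000
h = (t2 - t1) / n
f = 1 / (t2 - t1)                               # uniform density on [θ1, θ2]
ys = [t1 + (i + 0.5) * h for i in range(n)]
mean = sum(y * f * h for y in ys)               # (θ1 + θ2)/2 = 6
var = sum((y - mean) ** 2 * f * h for y in ys)  # (θ2 - θ1)^2 / 12
```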
General formula, E(Y), V(Y) for normal?
f(y) = (1/(σ√(2π))) e^(-(y-μ)²/(2σ²)); standardize with Z = (Y-μ)/σ
E(Y) = μ
V(Y) = σ^2
General formula, E(Y), V(Y) for gamma?
See notes for GF
E(Y) = αβ V(Y) = αβ^2
General formula, E(Y), V(Y) for chi-square?
Gamma formula with α=v/2 and β=2
E(Y)=v
V(Y)=2v
What is chi square distribution used for?
Determining the likelihood that an observed distribution is due to chance
General formula, E(Y), V(Y) for exponential distribution?
Gamma formula with α=1
Therefore f(y) = (1/β)e^(-y/β) for y > 0 (see notes)
E(Y) = β V(Y) = β^2
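A midpoint-rule check of both moments against the density (β = 2 is arbitrary; the integral is truncated at y = 60, i.e. 30β, where the tail is negligible):

```python
from math import exp

beta = 2.0
n, top = 120_000, 60.0
h = top / n
ys = [(i + 0.5) * h for i in range(n)]
dens = [exp(-y / beta) / beta for y in ys]   # f(y) = (1/β) e^(-y/β)
mean = sum(y * f * h for y, f in zip(ys, dens))
var = sum((y - mean) ** 2 * f * h for y, f in zip(ys, dens))
# mean comes out as β = 2 and var as β² = 4
```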
What is definition 4.13 regarding moments? (Both parts and equations)
If Y is a CRV, then the kth moment about the origin is given by:
μ’k = E(Y^k) k=1,2…
The kth moment about the mean, or the kth central moment, is given by:
μk = E[(Y-μ)^k] k=1,2…
(For k=1, μ’1=μ, and for k=2, μ2=V(Y))
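The same definitions apply to discrete RVs with sums in place of integrals; a small sketch on a fair six-sided die (chosen just for illustration):

```python
pmf = {y: 1 / 6 for y in range(1, 7)}           # fair six-sided die

def moment(k):
    # kth moment about the origin, E(Y^k)
    return sum(y**k * p for y, p in pmf.items())

mu = moment(1)                                   # μ'_1 = μ = 3.5

def central(k):
    # kth central moment, E[(Y - μ)^k]
    return sum((y - mu) ** k * p for y, p in pmf.items())

var = central(2)                                 # μ_2 = V(Y) = 35/12
```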
How many ways are there to place ‘n’ distinct objects in a row?
A permutation, since order matters; therefore n!
Number of ways to select 4 balls without replacement from a container with 15 distinct balls?
15choose4=1365
Given 10 maths teachers, need to choose 3 for a committee, what is the probability that Mr A, B and C are chosen?
(number of ways ABC can be selected)/total number of combos of maths teachers
=1/(10choose3)=1/120
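Both counting answers can be verified with `math.comb`:

```python
from fractions import Fraction
from math import comb

# 4 balls from 15 distinct balls, order irrelevant, no replacement
assert comb(15, 4) == 1365
# P(Mr A, B and C are all chosen) = 1 / C(10, 3)
prob = Fraction(1, comb(10, 3))
assert prob == Fraction(1, 120)
```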
See and learn
Table on other side of permutations notes
Define population?
The large body of data that is the target of our interest
What is a sample?
A subset of the population
Define the following:
1) experiment
2) event
3) simple event
4) compound event
5) sample space
1) the process by which an observation is made
2) the outcome of an experiment
3) an event that cannot be decomposed
4) an event that can be decomposed into simple events
5) set of all possible sample points
What is a discrete sample space?
A SS with a finite or countably infinite number of distinct sample points
What is the mn rule?
With m elements (a1…am) and n elements (b1…bn) it is possible to form m.n number of pairs containing one element from each group
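The mn rule can be seen directly with `itertools.product` (the group sizes m = 3 and n = 4 are arbitrary):

```python
from itertools import product

a_group = ["a1", "a2", "a3"]             # m = 3 elements
b_group = ["b1", "b2", "b3", "b4"]       # n = 4 elements
pairs = list(product(a_group, b_group))  # one element from each group
# the mn rule: m * n = 12 pairs
```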
Define permutation?
The number of ways of ordering n distinct objects taken r at a time (Pnr)
What does Cnr stand for?
The number of combinations of n objects taken r at a time is the number of subsets, each of size r, that can be formed from the n objects
How to tell if events A and B are independent?
If P(AnB)=P(A).P(B) they are independent
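A worked check of the independence condition on a two-dice sample space (the events A and B below are chosen for illustration):

```python
from itertools import product

space = list(product(range(1, 7), repeat=2))  # two fair dice: 36 equally likely outcomes
A = {s for s in space if s[0] == 6}           # first die shows 6
B = {s for s in space if s[1] % 2 == 0}       # second die is even
pA, pB, pAB = len(A) / 36, len(B) / 36, len(A & B) / 36
# P(A∩B) = 1/12 = P(A)·P(B), so A and B are independent
```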
See
L4 partitions definition
Define a random variable?
A real-valued function for which the domain is a sample space
What defines if a sample is a random sample?
If sampling is conducted such that each of the samples has an equal probability of being selected
5 properties of a binomial distribution?
1) Fixed number (n) of trials
2) Each trial results in success or failure
3) P(success) = p is constant from trial to trial; P(failure) = 1-p = q
4) Trials are independent
5) The RV Y is the number of successes during the n trials
Explain the negative binomial probability distribution?
Same layout as binomial except:
Y is the number of the trial on which the rth success occurs
First 2 moments definitions?
1) The kth moment of a RV Y taken about the origin is defined to be E(Y^k) and is denoted μ’k (ie. the mean)
2) The kth moment of a RV Y taken about its mean is defined to be E((Y-μ)^k) and is denoted μk
Why are probability mass functions called so?
Because they give the probability (mass) assigned to each of the finite or countably infinite possible values of these DRVs
What are distribution functions for DRVs always?
step functions
What is a cumulative distribution function?
for Y:
F(y), such that F(y)=P(Y<=y) for all y
What defines a continuous distribution function?
A RV Y is said to be continuous if F(y) is continuous for all y
What is P(Y=y) at any single point y for a continuous RV?
0
If F(y) is the distribution function for a CRV Y, what is f(y)?
The probability density function of Y (see graph L8)
See
L8 properties of a density function and bit below
How do you define the distribution of a RV Y?
A RV Y is said to have a given continuous/discrete distribution on an interval iff its density (or probability) function on that interval is the formula for that distribution (fill in for each case)
When might we use a gamma probability distribution and why?
If the RV is nonnegative; the gamma therefore produces positive distributions skewed to the right (draw diagram)
eg. wage data, size of firms etc.
See
Gamma notes L9
See
L10 on chi square and exponential distributions
What do multivariate probability distributions allow us to do?
Find out information on the intersection of events
For any RVs Y1 and Y2, the joint (bivariate) distribution is…
F(y1,y2) = P(Y1<=y1, Y2<=y2) (see L10 notes on this, all of it!!!)
When are 2 RVs said to be jointly continuous?
If their joint distribution F(y1,y2) is continuous in both arguments
What is R when doing double integrals, and what does the double integral give?
R=region of integration
It gives the volume under the surface z=f(x,y)
3 steps to working out the double integral?
1) Work out the limits of integration
2) Work out the inner integral
3) Work out the outer integral
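The three steps can be sketched numerically with a midpoint rule, here for f(x, y) = xy over R = [0, 1] × [0, 2] (chosen so the exact answer, 1, is easy to verify by hand):

```python
def f(x, y):
    return x * y

nx, ny = 400, 400
hx, hy = 1.0 / nx, 2.0 / ny       # step 1: limits of integration, 0<=x<=1, 0<=y<=2
total = 0.0
for i in range(nx):               # step 3: outer integral over x
    x = (i + 0.5) * hx
    inner = 0.0
    for j in range(ny):           # step 2: inner integral over y at fixed x
        y = (j + 0.5) * hy
        inner += f(x, y) * hy
    total += inner * hx
# volume under z = f(x, y): exactly ∫0^1 x dx * ∫0^2 y dy = 1
```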
See
Note 1 and 2 L11
Define marginal probability functions for discrete RVs?
If Y1 and Y2 are jointly discrete RVs with probability function p(y1,y2), then the marginal probability functions of Y1 and Y2 respectively are given by:
p1(y1) = Σ(all y2) p(y1,y2)
p2(y2) = Σ(all y1) p(y1,y2)
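Summing out the other variable is mechanical; a small sketch on a made-up joint pmf:

```python
joint = {  # a small joint pmf p(y1, y2); the probabilities are made up
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}
p1, p2 = {}, {}
for (y1, y2), pr in joint.items():
    p1[y1] = p1.get(y1, 0) + pr   # sum over all y2
    p2[y2] = p2.get(y2, 0) + pr   # sum over all y1
# p1 comes out as {0: 0.3, 1: 0.7} and p2 as {0: 0.4, 1: 0.6}
```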
Define marginal probability functions for continuous RVs?
If Y1 and Y2 are jointly continuous RVs with joint density function f(y1,y2), then the marginal density functions of Y1 and Y2 respectively are given by:
f1(y1) = ∫(-∞ to ∞) f(y1,y2) dy2
f2(y2) = ∫(-∞ to ∞) f(y1,y2) dy1
See
Conditional distributions bottom of L11 (v important dont get it yet)
See
top of L12 definition and both theorems
See
the expected value of a RV (CRV and DRV)
If Y1 and Y2 are RVs with means μ1 and μ2, the covariance of Y1 and Y2 is?
Cov(Y1,Y2) = E[(Y1-μ1)(Y2-μ2)] = E(Y1Y2)-E(Y1)E(Y2)
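The shortcut form E(Y1Y2) - E(Y1)E(Y2) is easy to compute from a joint pmf; a sketch on made-up numbers:

```python
joint = {  # joint pmf of (Y1, Y2); the probabilities are made up
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}
E1 = sum(y1 * p for (y1, y2), p in joint.items())
E2 = sum(y2 * p for (y1, y2), p in joint.items())
E12 = sum(y1 * y2 * p for (y1, y2), p in joint.items())
cov = E12 - E1 * E2   # 0.4 - 0.5*0.5 = 0.15: Y1 and Y2 co-vary positively
```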
What does a 0 covariance indicate?
No linear dependence between Y1 and Y2
NOTE: the opposite is not true; uncorrelated variables may not be independent
Problem with covariance? Solution to this?
It isn’t an absolute measure of dependence because its value depends upon the scale of measurement.
Solution: standardize its value by using the correlation coefficient:
ρ=Cov(Y1,Y2)/(σ(Y1).σ(Y2))
Given 2 independent RVs, what is Cov(Y1,Y2)?
0; therefore independent RVs must also be uncorrelated
See bottom of L12 side 2
now
What is the conditional expectation of Y1, given Y2=y2?
The weighted average of the values that Y1 can take on, where each possible value is weighted by its conditional probability
SEE conditional expectations equations top of L13
Law of iterated expectations?
E(Y1)=E[E(Y1|Y2)]
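The law can be verified on a small joint pmf by computing E(Y1|Y2=y2) for each y2 and averaging over the marginal of Y2 (the probabilities below are made up):

```python
joint = {  # joint pmf of (Y1, Y2); the probabilities are made up
    (1, 0): 0.2, (2, 0): 0.3,
    (1, 1): 0.4, (2, 1): 0.1,
}
E1 = sum(y1 * p for (y1, y2), p in joint.items())
p2 = {}
for (y1, y2), p in joint.items():
    p2[y2] = p2.get(y2, 0) + p                    # marginal of Y2
cond = {v: sum(y1 * p for (y1, y2), p in joint.items() if y2 == v) / p2[v]
        for v in p2}                              # E(Y1 | Y2 = y2)
iterated = sum(cond[v] * p2[v] for v in p2)       # E[E(Y1 | Y2)]
# iterated equals E(Y1) = 1.4
```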
What are mean independent variables?
Y1 is mean independent of Y2 if E(Y1|Y2) = E(Y1)
Mean independence implies the variables are uncorrelated, but not that they are independent
Note
Independence ⇒ mean independence ⇒ uncorrelated, BUT not the other way round
See notes
Probability distribution function of an RV (don’t get)
What is a statistic?
A function of the observable RVs in a sample and known constants
Make cards on L14
now
What is an estimator?
A rule, often expressed as a formula, that tells how to calculate the value of an estimate based on the measurements in a sample
Difference between an estimator and a statistic?
An estimator is a statistic used to estimate a particular parameter