Class Test 1 Flashcards

1
Q

General formula, E(Y), V(Y) for binomial?

A
P(Y=y) = (n choose y).p^y.q^(n-y)
E(Y) = np
V(Y) = npq
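
Not from the notes: a quick Python sketch with scipy.stats to sanity-check these formulas (n = 10, p = 0.3 are just illustrative values).

from math import comb
from scipy.stats import binom

n, p = 10, 0.3       # illustrative parameters
q = 1 - p

print(comb(n, 3) * p**3 * q**(n - 3))   # P(Y = 3) from the formula
print(binom.pmf(3, n, p))               # same value from scipy
print(binom.mean(n, p), n * p)          # E(Y) = np = 3.0
print(binom.var(n, p), n * p * q)       # V(Y) = npq = 2.1
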
2
Q

General formula, E(Y), V(Y) for geometric?

A
p(y) = q^(y-1).p
E(Y) = 1/p
V(Y) = (1-p)/p^2
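
A similar illustrative check (scipy.stats.geom uses the same support y = 1, 2, …; p = 0.25 is arbitrary).

from scipy.stats import geom

p = 0.25
q = 1 - p

print(geom.pmf(4, p), q**3 * p)      # P(Y = 4) = q^(y-1).p
print(geom.mean(p), 1 / p)           # E(Y) = 1/p = 4.0
print(geom.var(p), (1 - p) / p**2)   # V(Y) = (1-p)/p^2 = 12.0
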
3
Q

What is y in geometric distribution?

A

The number of the trial on which the first success occurs

4
Q

P(Y>y) for geometric distribution?

A

q^y
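
This holds because Y > y means the first y trials were all failures. A quick check (scipy's survival function sf gives P(Y > y); p = 0.25, y = 3 are illustrative).

from scipy.stats import geom

p, y = 0.25, 3
print(geom.sf(y, p))   # P(Y > 3)
print((1 - p)**y)      # q^y = 0.421875, the same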

5
Q

General formula, E(Y), V(Y) for negative binomial?

A
p(y) = (y-1 choose r-1).p^r.q^(y-r)
E(Y) = r/p
V(Y) = r(1-p)/p^2
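
One caveat if checking this in Python: scipy.stats.nbinom counts the failures before the rth success, so its variable is Y - r. A sketch with illustrative r = 3, p = 0.4.

from math import comb
from scipy.stats import nbinom

r, p = 3, 0.4
q = 1 - p
y = 7                                           # trial of the 3rd success

print(comb(y - 1, r - 1) * p**r * q**(y - r))   # formula above
print(nbinom.pmf(y - r, r, p))                  # same value after the shift
print(nbinom.mean(r, p) + r, r / p)             # E(Y) = r/p = 7.5
print(nbinom.var(r, p), r * (1 - p) / p**2)     # V(Y) = r(1-p)/p^2 = 11.25
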
6
Q

General formula, E(Y), V(Y) for poisson?

A
P(Y=y) = (e^-λ.λ^y)/y!
E(Y) = λ
V(Y) = λ
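
A minimal check (λ = 2.5 and y = 4 are illustrative).

from math import exp, factorial
from scipy.stats import poisson

lam, y = 2.5, 4
print(exp(-lam) * lam**y / factorial(y))    # formula above
print(poisson.pmf(y, lam))                  # same value
print(poisson.mean(lam), poisson.var(lam))  # both equal λ = 2.5
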
7
Q

General formula, E(Y), V(Y) for uniform?

A

f(y) = 1/(θ2-θ1) for θ1 ≤ y ≤ θ2

0 elsewhere

E(Y) = (θ1+θ2)/2

V(Y) = ((θ2-θ1)^2)/12
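
If checking in Python: scipy.stats.uniform is parameterised by loc = θ1 and scale = θ2 - θ1 (θ1 = 2, θ2 = 8 below are illustrative).

from scipy.stats import uniform

t1, t2 = 2.0, 8.0
U = uniform(loc=t1, scale=t2 - t1)

print(U.pdf(5.0), 1 / (t2 - t1))    # f(y) = 1/(θ2-θ1) inside the interval
print(U.mean(), (t1 + t2) / 2)      # E(Y) = 5.0
print(U.var(), (t2 - t1)**2 / 12)   # V(Y) = 3.0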

8
Q

General formula, E(Y), V(Y) for normal?

A

Z = (Y-μ)/σ, where Z ~ N(0,1)
E(Y) = μ
V(Y) = σ^2
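
The point of Z is that any normal probability can be read from the standard normal; a quick illustration (μ = 10, σ = 2, y = 13 are arbitrary).

from scipy.stats import norm

mu, sigma, y = 10.0, 2.0, 13.0
z = (y - mu) / sigma

print(norm.cdf(y, loc=mu, scale=sigma))          # P(Y <= 13)
print(norm.cdf(z))                               # same value via Z
print(norm.mean(mu, sigma), norm.var(mu, sigma)) # μ = 10, σ^2 = 4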

9
Q

General formula, E(Y), V(Y) for gamma?

A

See notes for GF

E(Y) = αβ
V(Y) = αβ^2
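
For a numerical check: scipy.stats.gamma takes the shape α as its first argument and β as scale (α = 2, β = 3 illustrative).

from scipy.stats import gamma

alpha, beta = 2.0, 3.0
G = gamma(a=alpha, scale=beta)

print(G.mean(), alpha * beta)     # E(Y) = αβ = 6.0
print(G.var(), alpha * beta**2)   # V(Y) = αβ^2 = 18.0
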
10
Q

General formula, E(Y), V(Y) for chi-square?

A

Gamma formula with α=v/2 and β=2

E(Y)=v
V(Y)=2v
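
A quick check that this really is the gamma with α = v/2, β = 2 (v = 5 and the point y = 3.7 are illustrative).

from scipy.stats import chi2, gamma

v, y = 5, 3.7
print(chi2.pdf(y, df=v))                 # chi-square density
print(gamma.pdf(y, a=v / 2, scale=2))    # gamma with α = v/2, β = 2: same
print(chi2.mean(df=v), chi2.var(df=v))   # E(Y) = v = 5, V(Y) = 2v = 10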

11
Q

What is chi square distribution used for?

A

Determining the likelihood that an observed distribution is due to chance

12
Q

General formula, E(Y), V(Y) for exponential distribution?

A

Gamma formula with α=1

Therefore f(y) = (1/β)e^(-y/β) (see notes)

E(Y) = β
V(Y) = β^2
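
Likewise a check that this is the gamma with α = 1 (scipy.stats.expon uses scale = β; β = 2, y = 1.5 illustrative).

from math import exp
from scipy.stats import expon, gamma

beta, y = 2.0, 1.5
print((1 / beta) * exp(-y / beta))                    # f(y) from the formula
print(expon.pdf(y, scale=beta))                       # same
print(gamma.pdf(y, a=1, scale=beta))                  # gamma with α = 1: same again
print(expon.mean(scale=beta), expon.var(scale=beta))  # β = 2, β^2 = 4
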
13
Q

What is definition 4.13 regarding moments? (Both parts and equations)

A

If Y is a CRV, then the kth moment about the origin is given by:

μ’k = E(Y^k) k=1,2…

The kth moment about the mean, or the kth central moment, is given by:

μk = E[(Y-μ)^k] k=1,2…

(For k=1, μ’1=μ, and for k=2, μ2=V(Y))
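
These are just integrals against the density; a sketch checking them numerically for an exponential RV (β = 2 illustrative), where μ'1 should equal β and μ2 should equal β^2.

from scipy.integrate import quad
from scipy.stats import expon

beta = 2.0
f = lambda y: expon.pdf(y, scale=beta)

mu1, _ = quad(lambda y: y * f(y), 0, float("inf"))             # μ'1 = E(Y)
mu2, _ = quad(lambda y: (y - mu1)**2 * f(y), 0, float("inf"))  # μ2 = E[(Y-μ)^2]

print(mu1)   # ≈ 2.0 = β
print(mu2)   # ≈ 4.0 = β^2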

14
Q

How many ways are there to place ‘n’ distinct objects in a row?

A

A permutation, since order matters; therefore n!

15
Q

Number of ways to select 4 balls without replacement from a container with 15 distinct balls?

A

15choose4=1365
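
A one-liner check (Python 3.8+ has math.comb for n choose r).

from math import comb
print(comb(15, 4))   # 1365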

16
Q

Given 10 maths teachers, need to choose 3 for a committee, what is the probability that Mr A, B and C are chosen?

A

(number of ways A, B and C can be selected)/(total number of 3-teacher committees from 10)

= 1/(10 choose 3) = 1/120

17
Q

See and learn

A

Table on other side of permutations notes

18
Q

Define population?

A

The large body of data that is the target of our interest

19
Q

What is a sample?

A

A subset of the population

20
Q

Define the following:

1) experiment
2) event
3) simple event
4) compound event
5) sample space

A

1) the process by which an observation is made
2) the outcome of an experiment
3) an event that cannot be decomposed
4) an event that can be decomposed into simple events
5) set of all possible sample points

21
Q

What is a discrete sample space?

A

A SS with a finite or countably infinite number of distinct sample points

22
Q

What is the mn rule?

A

With m elements (a1…am) and n elements (b1…bn), it is possible to form m.n pairs containing one element from each group

23
Q

Define permutation?

A

The number of ways of ordering n distinct objects taken r at a time (Pnr)

24
Q

What does Cnr stand for?

A

The number of combinations of n objects taken r at a time is the number of subsets, each of size r, that can be formed from the n objects

25
Q

How to tell if events A and B are independent?

A

If P(AnB)=P(A).P(B) they are independent

26
Q

See

A

L4 partitions definition

27
Q

Define a random variable?

A

A real-valued function for which the domain is a sample space

28
Q

What defines if a sample is a random sample?

A

If sampling is conducted such that each of the possible samples has an equal probability of being selected

29
Q

4 properties of a binomial distribution?

A

Fixed number (n) of trials
Each trial measures success or failure
The probability of success, p, is constant from trial to trial; the probability of failure is q = 1-p
Trials are independent

The RV Y is the number of successes during n trials

30
Q

Explain the negative binomial probability distribution?

A

Same setup as binomial except:

Y is the number of the trial on which the rth success occurs

31
Q

First 2 moments definitions?

A

1) The kth moment of a RV Y taken about the origin is defined to be E(Y^k) and is denoted μ’k (for k=1 this is the mean μ)
2) The kth moment of a RV Y taken about its mean is defined to be E((Y-μ)^k) and is denoted μk

32
Q

Why are probability mass functions called so?

A

Because they give the probability (mass) assigned to each of the finite or countably infinite possible values for these DRVs

33
Q

What are distribution functions for DRVs always?

A

step functions

34
Q

What is a cumulative distribution function?

A

for Y:

F(y), such that F(y)=P(Y<=y) for all y

35
Q

What defines a continuous distribution function?

A

A RV Y is said to be continuous if F(y) is continuous for all y

36
Q

What is P(Y=0) (or P(Y=y) at any single point) for a continuous RV?

A

0

37
Q

If F(y) is the distribution function for a CRV Y, what is f(y)?

A

The probability density function of Y, f(y) = dF(y)/dy wherever the derivative exists (see graph L8)

38
Q

See

A

L8 properties of a density function and bit below

39
Q

How do you define the distribution of a RV Y?

A

A RV Y is said to have a given (continuous or discrete) distribution on an interval iff its density/probability function takes the corresponding form on that interval (fill in the specific formula)

40
Q

When might we use a gamma probability distribution and why?

A

When the RV is nonnegative; the gamma therefore gives distributions on positive values that are skewed to the right (draw diagram)
eg. wage data, size of firms etc.

41
Q

See

A

Gamma notes L9

42
Q

See

A

L10 on chi square and exponential distributions

43
Q

What do multivariate probability distributions allow us to do?

A

Find out information on the intersection of events

44
Q

For any RVs Y1 and Y2, the joint (bivariate) distribution is…

A

F(y1,y2) = P(Y1<=y1, Y2<=y2) (see L10 notes on this, all of it!!!)

45
Q

When are 2 RVs said to be jointly continuous?

A

If their joint distribution F(y1,y2) is continuous in both arguments

46
Q

What is R when doing double integrals, and what does the double integral give?

A

R=region of integration

It gives the volume under the surface z=f(x,y)

47
Q

3 steps to working out the double integral?

A

1) Work out the limits of integration
2) Work out the inner integral
3) Work out the outer integral
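
A minimal sketch of these three steps with scipy.integrate.dblquad, using an illustrative joint density f(y1, y2) = y1 + y2 on the unit square.

from scipy.integrate import dblquad

# illustrative joint density on 0 <= y1 <= 1, 0 <= y2 <= 1
f = lambda y2, y1: y1 + y2   # dblquad integrates over its first argument (y2) first

# Step 1: limits of integration are 0..1 for both variables
# Steps 2-3: dblquad does the inner (y2) integral, then the outer (y1) integral
total, _ = dblquad(f, 0, 1, lambda y1: 0, lambda y1: 1)
print(total)   # 1.0 -> the volume under z = f(y1, y2) over R

# the same machinery gives probabilities over a sub-region, e.g. P(Y1 <= 0.5, Y2 <= 0.5)
prob, _ = dblquad(f, 0, 0.5, lambda y1: 0, lambda y1: 0.5)
print(prob)    # 0.125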

48
Q

See

A

Note 1 and 2 L11

49
Q

Define marginal probability functions for discrete RVs?

A

If Y1 and Y2 are jointly discrete RVs with probability function p(y1,y2), then the marginal probability functions of Y1 and Y2 respectively are given by:

p1(y1) = (all y2)Σp(y1,y2)

p2(y2) = (all y1)Σp(y1,y2)
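
A tiny worked example of "summing out" the other variable, with an illustrative joint pmf stored as a dict.

# illustrative joint pmf p(y1, y2) on {0,1} x {0,1}
p = {(0, 0): 0.125, (0, 1): 0.125, (1, 0): 0.25, (1, 1): 0.5}

p1 = {y1: sum(pr for (a, b), pr in p.items() if a == y1) for y1 in (0, 1)}  # sum over all y2
p2 = {y2: sum(pr for (a, b), pr in p.items() if b == y2) for y2 in (0, 1)}  # sum over all y1

print(p1)   # {0: 0.25, 1: 0.75}
print(p2)   # {0: 0.375, 1: 0.625}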

50
Q

Define marginal probability functions for continuous RVs?

A

If Y1 and Y2 are jointly continuous RVs with joint density function f(y1,y2), then the marginal density functions of Y1 and Y2 respectively are given by:

f1(y1) = (-∞->∞)∫f(y1,y2)dy2

f2(y2) = (-∞->∞)∫f(y1,y2)dy1
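
The continuous analogue, sketched numerically with scipy.integrate.quad for the illustrative joint density f(y1, y2) = y1 + y2 on the unit square (its marginal is f1(y1) = y1 + 1/2).

from scipy.integrate import quad

f = lambda y1, y2: y1 + y2   # valid only on 0 <= y1, y2 <= 1 (illustrative)

def f1(y1):
    # integrate the other variable out: f1(y1) = ∫ f(y1, y2) dy2
    val, _ = quad(lambda y2: f(y1, y2), 0, 1)
    return val

print(f1(0.3))             # ≈ 0.8 = 0.3 + 1/2
print(quad(f1, 0, 1)[0])   # ≈ 1.0 -> the marginal is itself a proper density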

51
Q

See

A

Conditional distributions bottom of L11 (v important dont get it yet)

52
Q

See

A

top of L12 definition and both theorems

53
Q

See

A

the expected value of a RV (CRV and DRV)

54
Q

If Y1 and Y2 are RVs with means μ1 and μ2, the covariance of Y1 and Y2 is?

A

Cov(Y1,Y2) = E[(Y1-μ1)(Y2-μ2)] = E(Y1Y2)-E(Y1)E(Y2)
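
A small discrete check that the two expressions agree, using an illustrative joint pmf.

# illustrative joint pmf p(y1, y2)
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.375}

E1  = sum(y1 * pr for (y1, y2), pr in p.items())
E2  = sum(y2 * pr for (y1, y2), pr in p.items())
E12 = sum(y1 * y2 * pr for (y1, y2), pr in p.items())

cov_def      = sum((y1 - E1) * (y2 - E2) * pr for (y1, y2), pr in p.items())
cov_shortcut = E12 - E1 * E2

print(cov_def, cov_shortcut)   # both 0.0625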

55
Q

What does a 0 covariance indicate?

A

No linear dependence between Y1 and Y2

NOTE: the converse is not true; uncorrelated variables need not be independent

56
Q

Problem with covariance? Solution to this?

A

It isn’t an absolute measure of dependence because its value depends upon the scale of measurement.

Solution: standardize its value by using the correlation coefficient:

ρ=Cov(Y1,Y2)/(σ(Y1).σ(Y2))

57
Q

Given 2 IRVs: Cov(Y1,Y2)=0?

A

Therefore IRVs must also be uncorrelated

58
Q

See bottom of L12 side 2

A

now

59
Q

What is the conditional expectation of Y1, given Y2=y2?

A

The weighted average of the values that Y1 can take on, where each possible value is weighted by its conditional probability
SEE conditional expectations equations top of L13

60
Q

Law of iterated expectations?

A

E(Y1)=E[E(Y1|Y2)]
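
A quick discrete verification that averaging the conditional means of Y1 over the distribution of Y2 recovers E(Y1) (illustrative joint pmf).

# illustrative joint pmf p(y1, y2)
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.375}

E1 = sum(y1 * pr for (y1, y2), pr in p.items())

p2 = {y2: sum(pr for (a, b), pr in p.items() if b == y2) for y2 in (0, 1)}   # marginal of Y2
m  = {y2: sum(y1 * pr for (y1, b), pr in p.items() if b == y2) / p2[y2]
      for y2 in (0, 1)}                                                      # E(Y1 | Y2 = y2)

print(E1, sum(m[y2] * p2[y2] for y2 in (0, 1)))   # both 0.5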

61
Q

What are mean independent variables?

A

Y1 is mean independent of Y2 if E(Y1|Y2) = E(Y1)

Mean-independent variables are uncorrelated, but need not be fully independent

62
Q

Note

A

Independence implies mean independence, which implies uncorrelated, BUT not the other way round

63
Q

See notes

A

Probability distribution function of an RV (don’t get)

64
Q

What is a statistic?

A

A function of the observable RVs in a sample and known constants

65
Q

Make cards on L14

A

now

66
Q

What is an estimator?

A

A rule, often expressed as a formula, that tells how to calculate the value of an estimate based on the measurements in a sample

67
Q

Difference between an estimator and a statistic?

A

An estimator is a statistic that is used to estimate a particular parameter