Definitions Flashcards

1
Q

Statistic Def

A

Let J be an arbitrary set.
A function T: Omega -> J, independent of Theta, is called a statistic.

2
Q

Statistic Sufficiency

A

A statistic T: Omega -> J is sufficient for Theta if, on each set of the partition
{ x in Omega : T(x) = a }, a in J,
the conditional distribution of X given T(X) = a does not depend on Theta.

3
Q

Neyman Factorisation Theorem

A

A statistic T is sufficient for Theta if and only if there exist functions g and h such that

L(x|Theta) = g(Theta, T(x)) h(x)
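A worked sketch (Bernoulli example, added for illustration, not from the original card): for an IID Bernoulli(theta) sample,

```latex
L(x \mid \theta) = \prod_{i=1}^{n}\theta^{x_i}(1-\theta)^{1-x_i}
= \underbrace{\theta^{t}(1-\theta)^{n-t}}_{g(\theta,\,T(x))}\cdot\underbrace{1}_{h(x)},
\qquad t = T(x) = \sum_{i=1}^{n} x_i,
```

so T(x) = sum x_i is sufficient for theta by the factorisation.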

4
Q

Statistic T completeness

A

A statistic T is complete if its family of distributions is complete.

5
Q

Exponential Family Completeness Theorem

A

X = (X_1, ..., X_n) is an IID sample from a probability model

P(x|Theta) = exp[Theta B(x) + C(Theta) + D(x)], Theta in THETA.

If THETA contains an open interval, then T = sum_i B(X_i) is complete.
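A worked instance (added for illustration): the Poisson(lambda) model fits this form after reparameterising by theta = log lambda:

```latex
P(x \mid \theta) = \frac{e^{-\lambda}\lambda^{x}}{x!}
= \exp\!\bigl[\theta x - e^{\theta} - \log x!\bigr],
\qquad \theta = \log\lambda,
```

so B(x) = x, C(theta) = -e^theta, D(x) = -log x!, and since Theta ranges over all of R (an open interval), T = sum x_i is complete.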

6
Q

Bahadur's Theorem

A

If a statistic T is sufficient and complete for Theta, then it is also minimal sufficient. However, the converse is not true.

7
Q

MLE

A

The MLE Theta_hat(x) of Theta is a value of Theta that maximises the likelihood L(x|Theta) with respect to Theta for fixed x.
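A minimal numerical sketch (Bernoulli example, added for illustration): maximise the log-likelihood over a grid and compare with the closed-form MLE, the sample mean.

```python
import math

def bernoulli_log_lik(theta, xs):
    """Log-likelihood of a Bernoulli(theta) sample xs."""
    t = sum(xs)          # sufficient statistic
    n = len(xs)
    return t * math.log(theta) + (n - t) * math.log(1 - theta)

def mle_grid(xs, grid_size=10_000):
    """Crude grid-search MLE over (0, 1); the closed form is mean(xs)."""
    grid = [(k + 1) / (grid_size + 2) for k in range(grid_size)]
    return max(grid, key=lambda th: bernoulli_log_lik(th, xs))

xs = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # toy sample: 7 successes in 10
theta_hat = mle_grid(xs)
print(theta_hat)                      # close to 0.7 = mean(xs)
```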

8
Q

Small Volatility (MSE)

A

We want an estimator to have small MSE:

MSE(g_hat) = Var[g_hat(X)] + [bias_g_hat(Theta)]^2
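An exact check of the decomposition on a toy discrete example (added for illustration; the estimator and numbers are made up): X is one Bernoulli(theta) draw and g_hat(x) = 0.9x is a deliberately biased estimator of theta.

```python
# Verify MSE = Var + bias^2 by exact enumeration over the sample space.
theta = 0.4
probs = {0: 1 - theta, 1: theta}      # pmf of a single Bernoulli(theta) draw

g_hat = lambda x: 0.9 * x             # biased toy estimator of theta
mean = sum(p * g_hat(x) for x, p in probs.items())
var = sum(p * (g_hat(x) - mean) ** 2 for x, p in probs.items())
bias = mean - theta
mse = sum(p * (g_hat(x) - theta) ** 2 for x, p in probs.items())

print(mse, var + bias ** 2)           # the two sides agree
```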

9
Q

Consistency of an estimator

A

An estimator g_hat(X) is consistent if

g_hat(X) -> g(Theta) in probability as n -> infinity
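A deterministic illustration (added, not from the card): for the sample mean of Bernoulli(theta) draws, P(|X_bar - theta| > eps) can be computed exactly from binomial probabilities and shrinks toward 0 as n grows.

```python
from math import comb

def prob_far(n, theta=0.5, eps=0.1):
    """P(|X_bar - theta| > eps) for the mean of n Bernoulli(theta) draws,
    computed exactly from the Binomial(n, theta) pmf."""
    return sum(
        comb(n, k) * theta**k * (1 - theta)**(n - k)
        for k in range(n + 1)
        if abs(k / n - theta) > eps
    )

for n in (10, 100, 1000):
    print(n, prob_far(n))   # probability shrinks toward 0 as n grows
```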

10
Q

MLE and sufficiency Def

A

If T(x) is sufficient in the likelihood model, then the MLE is a function of the sufficient statistic.

11
Q

MLE and CRLB Theorem

A

If theta_tilde, an unbiased estimator of Theta, attains the CRLB, then theta_tilde is the unique MLE of Theta.

12
Q

Rao-Blackwell Theorem

A

Let g_hat(X) be an unbiased estimator of g(Theta). If T = T(X) is sufficient for Theta, then the estimator

g_tilde(t) = E[g_hat(X) | T(X) = t]

is an unbiased estimator of g(Theta), and

Var(g_tilde(T)) <= Var(g_hat(X))
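An exact toy illustration (added, not from the card): with X_1, ..., X_n IID Bernoulli(theta), g_hat(X) = X_1 is unbiased for theta, and conditioning on the sufficient statistic T = sum X_i gives g_tilde(t) = E[X_1 | T = t] = t/n (by symmetry), which stays unbiased with smaller variance.

```python
from math import comb

theta, n = 0.3, 5
pmf = [comb(n, t) * theta**t * (1 - theta)**(n - t) for t in range(n + 1)]

# Rao-Blackwellised estimator g_tilde(t) = E[X_1 | T = t] = t / n
mean_rb = sum(p * (t / n) for t, p in enumerate(pmf))
var_rb = sum(p * (t / n - mean_rb) ** 2 for t, p in enumerate(pmf))

var_raw = theta * (1 - theta)   # Var(X_1), the crude unbiased estimator

print(mean_rb, var_rb, var_raw)  # still unbiased, and var_rb < var_raw
```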

13
Q

Lehmann-Scheffé Theorem

A

Suppose there is an unbiased estimator g_hat(X) for g(Theta) and a statistic T = T(X) that is sufficient and complete for Theta.
Then

g_tilde(t) = E[g_hat(X) | T(X) = t]

is the unique MVUE of g(Theta).

14
Q

Type 1 and Type 2 errors, size and power

A

T1 - Reject H0 when H0 is true
T2 - Do not reject H0 when H0 is false

Size - P(Type 1 Error)
Power - 1 - P(Type 2 Error), i.e. P(Reject H0 when H0 is false)
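A concrete sketch (added, illustrative numbers): size and power of the one-sided z-test of H0: mu = 0 vs H1: mu > 0, based on the mean of n observations from N(mu, 1).

```python
from statistics import NormalDist
from math import sqrt

Z = NormalDist()
alpha, n, mu1 = 0.05, 25, 0.5

crit = Z.inv_cdf(1 - alpha)               # reject when sqrt(n)*x_bar > crit
size = 1 - Z.cdf(crit)                    # P(reject | mu = 0) = alpha
power = 1 - Z.cdf(crit - mu1 * sqrt(n))   # P(reject | mu = mu1)

print(size, power)
```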

15
Q

Uniformly Most Powerful (UMP) Test

A

A test of size alpha is UMP if,

For all theta in THETA1, its power is greater than or equal to the power of any other test of size alpha

16
Q

UMP Lemma

A

There exists a UMP test of size alpha for H0 vs H1 if and only if there is a test phi that is the MP test of H0 vs H1': theta = theta1, for every theta1 in THETA1.

17
Q

3 Examples of Non-Informative Priors and explain them

A

Ignorant: All values of theta are equally likely.

Vague: Choose a prior with a very flat curve, e.g. a Gamma(a, b) prior, then let a, b -> 0.

Jeffreys' prior:

Pi(theta) is proportional to the square root of the Fisher information.
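A standard worked case (added for illustration): for a single Bernoulli(theta) observation the Fisher information is I(theta) = 1/[theta(1-theta)], so

```latex
\pi(\theta) \propto \sqrt{I(\theta)} = \theta^{-1/2}(1-\theta)^{-1/2},
```

i.e. a Beta(1/2, 1/2) distribution.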

18
Q

Sufficiency Theorem for Statistic in Posterior

A

If T(X) is sufficient for theta, then

Pi(Theta|X) = Pi(Theta|T(X) = t)

19
Q

Conjugate Family Def

A

A class C of distributions is said to form a conjugate family for the likelihood if the posterior distribution is in C whenever the prior distribution is.

20
Q

Bayesian Credible Region

A

The region C(alpha) is a 100(1-alpha)% BCR for theta if

P(Theta in C(alpha) | x) = 1 - alpha
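A minimal sketch (added; the posterior and numbers are illustrative): the equal-tailed 100(1 - alpha)% credible region for a Normal posterior theta | x ~ N(m, s^2).

```python
from statistics import NormalDist

alpha, m, s = 0.05, 1.2, 0.4
post = NormalDist(m, s)                  # posterior of theta given x

lo = post.inv_cdf(alpha / 2)             # equal-tailed region endpoints
hi = post.inv_cdf(1 - alpha / 2)
coverage = post.cdf(hi) - post.cdf(lo)   # posterior mass in the region

print(lo, hi, coverage)                  # coverage = 1 - alpha
```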

21
Q

Highest Posterior Density Region Def

A

The region C(alpha) is a HPDR for theta if

C(alpha) = {Theta : Pi(Theta|x) >= gamma}, where gamma is chosen so that

P(Theta in C(alpha) | x) = 1 - alpha

22
Q

HPDR in practice

A
  • Check where the posterior pdf is 0 (say, outside an interval (x, y))
  • Check its sign and whether it is increasing or decreasing (f'(x) > 0 etc.) on (x, y)

Increasing -> take (a, y)
Decreasing -> take (x, a)

Then integrate the posterior pdf over this interval and set the result equal to 1 - alpha.

23
Q

Conjugate prior for sigma known, mu unknown

A

When sigma^2 is known, the conjugate prior for mu is Normal, N(m, v sigma^2); the posterior for mu is then also Normal.

24
Q

Conjugate prior for mu known, sigma unknown

A

When mu is known, the conjugate prior for sigma^2 is the Inverse Gamma, IG(a, d).

The posterior is proportional to an IG([a + (mu - m)^2]/v, d + 1) distribution.

25
Q

Conjugate prior for general case when sigma and mu are both unknown

A

The Normal Inverse Gamma distribution,

NIG(a, d, m, v).

Under this prior, mu and sigma^2 are not independent.