Lecture 13 Flashcards

1
Q

What if a marginal distribution isn’t Gaussian ?

A

It may be impossible to define the joint distribution directly:

  • Case where the 2 variables have different marginal distributions
  • Case where, for a large number of margins, no standard multivariate distribution exists

→ use copula models

2
Q

What are copulas ?

A

Copulas relate the two marginal distributions instead of relating the 2 series directly:

  • Able to relate any kind of margins
  • Possible to generate non-linear dependence

→ Pearson’s correlation is not an appropriate measure of dependence

3
Q

What is the basic setup of a copula ?

A
  • 2 rv X and Y with marginal distributions F(x) = Pr[X ≤ x] and G(y) = Pr[Y ≤ y]
  • The cdfs are continuous
  • Joint distribution H(x,y) = Pr[X ≤ x, Y ≤ y]
  • All of these functions have range [0,1]

→ a standard parametric form for H might not exist

4
Q

What is the definition of a bivariate copula ?

A

Function C : [0,1] × [0,1] → [0,1]

5
Q

What are the properties of a bivariate copula ?

A

• C(u,v) is increasing in u and in v: holding one argument fixed, C increases in the other

• C(u,0) = 0, C(u,1) = u, C(0,v) = 0, C(1,v) = v
o If one probability is 0, the joint probability is 0 as well
o If one probability is 1, the joint probability is determined by the remaining one

• Pr[u1 ≤ U ≤ u2, v1 ≤ V ≤ v2] = C(u2,v2) – C(u2,v1) – C(u1,v2) + C(u1,v1) ≥ 0
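
A minimal numerical check of these properties, using the independence copula C(u,v) = uv as a stand-in (the copula choice and the test points are illustrative assumptions, not from the lecture):

```python
import numpy as np

# Independence copula: C(u, v) = u * v (used here only to illustrate the axioms)
def C(u, v):
    return u * v

u, v = 0.3, 0.7
u1, u2, v1, v2 = 0.2, 0.6, 0.1, 0.9

# Boundary conditions
assert np.isclose(C(u, 0.0), 0.0) and np.isclose(C(0.0, v), 0.0)
assert np.isclose(C(u, 1.0), u) and np.isclose(C(1.0, v), v)

# Rectangle (2-increasing) inequality:
# Pr[u1 <= U <= u2, v1 <= V <= v2] = C(u2,v2) - C(u2,v1) - C(u1,v2) + C(u1,v1) >= 0
rect = C(u2, v2) - C(u2, v1) - C(u1, v2) + C(u1, v1)
assert rect >= 0.0
print("rectangle probability:", rect)  # (u2-u1)*(v2-v1) = 0.4*0.8 = 0.32
```
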

6
Q

What is the copula theorem (Sklar’s theorem) ?

A

H = joint distribution of X and Y with marginal distributions F, G. Then:

• There exists a copula C s.t. H(x,y) = C[F(x), G(y)]

→ if F and G are continuous, C is unique
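
As a hedged sketch of the theorem in code, the snippet below builds a joint cdf H(x,y) = C[F(x), G(y)] by plugging two arbitrary non-Gaussian margins into a Gaussian copula; the choice of margins (exponential and Student-t), the correlation parameter, and the use of scipy are assumptions for illustration only:

```python
import numpy as np
from scipy.stats import norm, expon, t, multivariate_normal

rho = 0.5
biv_norm = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def gaussian_copula(u, v):
    # C(u, v) = Phi_2(Phi^-1(u), Phi^-1(v); rho)
    return biv_norm.cdf([norm.ppf(u), norm.ppf(v)])

# Arbitrary (non-Gaussian) margins F and G, chosen only for illustration
F = expon(scale=2.0).cdf     # X ~ Exponential
G = t(df=4).cdf              # Y ~ Student-t(4)

def H(x, y):
    # Sklar: H(x, y) = C[F(x), G(y)]
    return gaussian_copula(F(x), G(y))

print(H(1.5, 0.3))  # a valid joint cdf value in [0, 1]
```
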

7
Q

What are the various measures of dependence and concordance ?

A
  • Dependence = strength of the relation between 2 variables
  • Association = positive or negative relation
  • Concordance = association where small values of one tend to come with small values of the other, and likewise for large values → the pair is concordant if X1 < X2 and Y1 < Y2, i.e. (X1 - X2)(Y1 - Y2) > 0
  • Discordant = the opposite of concordant → (X1 - X2)(Y1 - Y2) < 0
8
Q

What is the measure of concordance between rv X and Y ? What are its properties ?

A

κ(x,y)

  • Defined for every pair of rv = completeness
  • Normalized measure: -1 ≤ κ ≤ 1, with κ(x,x) = 1 and κ(x,-x) = -1
  • Symmetric : κ(y,x) = κ(x,y)
  • If X and Y are independent → κ(x,y) =0
  • κ(x,-y) = κ(-x,y) = - κ(x,y)

→ a measure of concordance is invariant w.r.t. strictly increasing (monotone) transformations

9
Q

Is Pearson’s correlation a measure of concordance ?

A

Only under normality; in general it is just a measure of association

10
Q

What are Kendall’s Tau and Spearman’s rho ?

A

Measures of concordance

11
Q

What does it mean when two series are comonotonic ?

A

κ(x,y) = 1 (perfect positive dependence: the two series move together monotonically)

12
Q

What does it mean when two series are counter-monotonic ?

A

κ(x,y) = -1 (perfect negative dependence: one series moves monotonically opposite to the other)

13
Q

What is Kendall’s tau for 2 rv ?

A

Probability of concordance minus probability of discordance for two independent pairs (X1,Y1) and (X2,Y2) from the joint distribution: τ = Pr[(X1 - X2)(Y1 - Y2) > 0] - Pr[(X1 - X2)(Y1 - Y2) < 0]

14
Q

What is Spearman’s rho ?

A

A multiple (factor 3) of the probability of concordance minus the probability of discordance of the pairs (X1,Y1) and (X2,Y3), where X2 and Y3 are independent: ρS = 3 (Pr[(X1 - X2)(Y1 - Y3) > 0] - Pr[(X1 - X2)(Y1 - Y3) < 0])
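
A small illustration of both concordance measures on simulated data (the data-generating process is an assumption chosen to make the relation monotone but non-linear):

```python
import numpy as np
from scipy.stats import kendalltau, spearmanr

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = np.exp(x) + 0.1 * rng.normal(size=1000)   # strongly, non-linearly related to x

tau, _ = kendalltau(x, y)
rho_s, _ = spearmanr(x, y)
print(f"Kendall's tau: {tau:.3f}, Spearman's rho: {rho_s:.3f}")
# Both stay close to 1 because the relation is (almost) monotone increasing,
# even though it is far from linear.
```
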

15
Q

What happens if X2 and Y3 are independent in Spearman’s rho ?

A

ρS = a measure of the distance between the joint distribution of (X,Y) and the case of independence

16
Q

How can Spearman’s rho also be viewed ?

A

Pearson’s correlation between F(X) and G(Y), i.e. between the ranks of X and Y
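
A quick numerical confirmation of this view, assuming a simulated sample and scipy's default handling of ties via average ranks:

```python
import numpy as np
from scipy.stats import rankdata, spearmanr, pearsonr

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = x ** 3 + rng.normal(size=500)

# Spearman's rho computed directly ...
rho_direct, _ = spearmanr(x, y)
# ... equals Pearson's correlation applied to the ranks of X and Y
rho_via_ranks, _ = pearsonr(rankdata(x), rankdata(y))

print(rho_direct, rho_via_ranks)   # identical up to floating-point error
```
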

17
Q

What is the Pearson’s correlation ?

A

The natural scalar measure of linear dependence in elliptical distributions → a misleading measure of dependence in more general situations

18
Q

What are Pearson’s correlation’s properties ?

A

• ρ[X,Y] is invariant under linear transformations only

• ρ[X,Y] is bounded: -1 ≤ ρL ≤ ρ[X,Y] ≤ ρU ≤ 1
o ρU is attained in the comonotonic case and ρL in the counter-monotonic case

  • ρ[X,Y] for comonotonic (counter-monotonic) variables can be different from 1 (-1)
  • ρ[X,Y] = 0 does not imply independence between X and Y
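
A minimal sketch of the last point: with X standard normal and Y = X², the Pearson correlation is essentially zero even though Y is a deterministic function of X (the example is an assumption, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = x ** 2                      # Y is a deterministic function of X -> clearly dependent

# Pearson correlation is (approximately) zero because Cov(X, X^2) = E[X^3] = 0
print(np.corrcoef(x, y)[0, 1])  # close to 0 despite perfect dependence
```
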
19
Q

What is the first approach to modeling of non-linear dependence ?

A

Estimate the unrestricted joint density non-parametrically

→ deduce a non-parametric estimate of the associated unrestricted copula (the empirical copula)
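
A hedged sketch of this first approach, restricted to the copula part: the empirical copula is built directly from rescaled ranks (pseudo-observations); the data-generating process and the 1/(n+1) rescaling convention are illustrative assumptions:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula(x, y):
    """Return a function C_hat(u, v) built from the ranks of the sample."""
    n = len(x)
    u_hat = rankdata(x) / (n + 1)   # pseudo-observations of U = F(X)
    v_hat = rankdata(y) / (n + 1)   # pseudo-observations of V = G(Y)

    def C_hat(u, v):
        return np.mean((u_hat <= u) & (v_hat <= v))

    return C_hat

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = 0.8 * x + 0.6 * rng.normal(size=2000)
C_hat = empirical_copula(x, y)
print(C_hat(0.5, 0.5))   # > 0.25 (= independence value) for positively dependent data
```
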

20
Q

What are the advantages and disadvantages of empirical copulas ?

A

• Advantage
o Does not require any additional assumption on the non-linear dependence

• Drawback
o Interpreting the patterns of non-linear dependence is complicated
o Likely to provide inaccurate and erratic results

21
Q

What are the special cases of elliptical copulas ?

A
  • Normal distribution
  • T distribution
  • Cauchy distribution
  • Laplace distribution
  • Uniform distribution
22
Q

What is the condition for an Elliptical copula ?

A
  • A random vector X ∈ R^n has a multivariate elliptical distribution if its density is f(x) = |Σ|^(-1/2) g[(x - μ)' Σ^(-1) (x - μ)] for some g: R → R+, where Σ is positive definite
  • Contours of equal density form ellipsoids in R^n
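
As a sketch of how an elliptical copula arises from such a distribution, the snippet below takes the Gaussian special case: draw from a bivariate normal and apply the marginal cdfs to obtain (U,V) distributed according to the Gaussian copula (the correlation value and sample size are arbitrary assumptions):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]

# 1. Draw from the elliptical (here: bivariate normal) distribution
z = multivariate_normal(mean=[0.0, 0.0], cov=cov).rvs(size=5000, random_state=4)

# 2. Apply the marginal cdfs -> (U, V) follows the Gaussian copula
u, v = norm.cdf(z[:, 0]), norm.cdf(z[:, 1])

print(u.min() > 0, u.max() < 1)             # uniforms on (0, 1)
print(np.corrcoef(z[:, 0], z[:, 1])[0, 1])  # ~ rho, dependence carried over to (U, V)
```
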
23
Q

What are Archimedean Copulas ?

A

Copulas that are not derived from multivariate distribution functions

24
Q

What is the theorem behind Archimedean copulas ?

A

φ = a continuous, strictly decreasing function from [0,1] to [0,∞) s.t. φ(1) = 0, with φ^(-1) the inverse of φ. The function from [0,1]^2 to [0,1] defined by C(u,v) = φ^(-1)[φ(u) + φ(v)] is a copula if and only if φ is convex
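
A minimal sketch of the construction C(u,v) = φ^(-1)[φ(u) + φ(v)], using the generator φ(t) = -log t (an illustrative choice, which gives back the independence copula):

```python
import numpy as np

# Generator of the independence copula: phi(t) = -log(t), phi^{-1}(s) = exp(-s)
phi = lambda t: -np.log(t)
phi_inv = lambda s: np.exp(-s)

def archimedean_C(u, v):
    # C(u, v) = phi^{-1}[ phi(u) + phi(v) ]
    return phi_inv(phi(u) + phi(v))

u, v = 0.4, 0.9
print(archimedean_C(u, v), u * v)   # both 0.36: this generator yields C(u,v) = uv
```
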

25
Q

What are Archimedean copulas’ advantage ?

A

Most of them have closed-form expressions

26
Q

What is φ for Archimedean Copulas ?

A

Generator of copula.

27
Q

What are C’s properties under Archimedean copulas ?

A
  • Symmetric: C(u,v) = C(v,u)
  • Associative
  • τ(C) = 1 + 4 ∫_0^1 φ(t)/φ'(t) dt
28
Q

What are Clayton copula’s properties ?

A
  • φ(t) = (t^(-θ) - 1)/θ for θ ∈ (0,∞) → C(u,v) = (u^(-θ) + v^(-θ) - 1)^(-1/θ)
  • Density of the copula: c(u,v) = (1+θ) (uv)^(-θ-1) (u^(-θ) + v^(-θ) - 1)^(-2-(1/θ))
  • τ(C) = θ/(θ+2)
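
A hedged sketch of these formulas, with θ = 2 chosen arbitrarily; the simulation check uses a gamma-frailty sampler, which is one standard way to draw from a Clayton copula (the sampler itself is not part of the card):

```python
import numpy as np
from scipy.stats import kendalltau

theta = 2.0   # Clayton parameter, chosen for illustration

def clayton_C(u, v):
    # C(u,v) = (u^-theta + v^-theta - 1)^(-1/theta)
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def clayton_c(u, v):
    # copula density c(u,v) from the card above
    return (1 + theta) * (u * v) ** (-theta - 1) * \
           (u ** -theta + v ** -theta - 1.0) ** (-2.0 - 1.0 / theta)

print(clayton_C(0.3, 0.6), clayton_c(0.3, 0.6))

# Check tau(C) = theta/(theta+2) by simulation (gamma-frailty construction)
rng = np.random.default_rng(5)
n = 20_000
w = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
e1, e2 = rng.exponential(size=(2, n))
u, v = (1.0 + e1 / w) ** (-1.0 / theta), (1.0 + e2 / w) ** (-1.0 / theta)

tau_hat, _ = kendalltau(u, v)
print(tau_hat, theta / (theta + 2))   # sample tau close to 0.5
```
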
29
Q

What is one drawback of the Clayton copula and the solution ?

A

Dependence only in lower tail → need rotated copula

30
Q

What is the rotated Clayton copula ?

A

Cr(u,v) = u + v - 1 + C(1-u, 1-v)
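
A tiny sketch of the rotation formula; the check uses the independence copula (an illustrative assumption), for which the rotation changes nothing:

```python
def rotated(C):
    # Cr(u, v) = u + v - 1 + C(1 - u, 1 - v)
    return lambda u, v: u + v - 1.0 + C(1.0 - u, 1.0 - v)

indep = lambda u, v: u * v
u, v = 0.3, 0.8
print(rotated(indep)(u, v), u * v)   # the independence copula is rotation-invariant
```
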

31
Q

What are Gumbel copula’s properties ?

A
  • When φ(t) = (-log t)^θ for θ ∈ [1,∞)
  • C(u,v) = exp{-[(-log u)^θ + (-log v)^θ]^(1/θ)}
  • τ(C) = 1 - (1/θ)
  • The rotated Gumbel is built the same way as the rotated Clayton, but it has dependence in the lower tail (Gumbel itself has upper-tail dependence)
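
A minimal sketch of the Gumbel formulas with an arbitrary θ = 1.5, checking the boundary conditions and the Kendall's tau of the card:

```python
import numpy as np

theta = 1.5   # Gumbel parameter, chosen for illustration (theta >= 1)

def gumbel_C(u, v):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Boundary checks and the Kendall's tau formula of the card
u, v = 0.4, 0.7
print(np.isclose(gumbel_C(u, 1.0), u), np.isclose(gumbel_C(1.0, v), v))
print("tau =", 1.0 - 1.0 / theta)   # = 1/3 for theta = 1.5
```
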
32
Q

What are the various approaches to estimate the parameters of copula ?

A
  • Standard ML estimation
  • 2-step estimation
  • Semi-parametric estimation
  • Method of moments
33
Q

Why is MLE difficult to implement in practical application for copulas ?

A
  • Dimension of the optimization can be very large
  • No analytical expression for the gradient of the likelihood

34
Q

What is the two-steps estimation procedure for copulas ?

A

Separation of the vector of parameters into different parts → margin parameters and copula parameters

  • Step 1 = estimation of the margins
  • Step 2 = estimation of the copula parameters conditional on the estimated margins

• If the model is correctly specified, the estimator is consistent and asymptotically normal
o √T (θ_IFM - θ_0) ~ N(0, Ω_0)
o Ω_0 = A_0^(-1) B_0 A_0^(-1)
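
A hedged sketch of the two-step idea, under assumed Gaussian margins and an assumed Clayton copula (the data-generating process, the margin family, and the optimizer settings are all illustrative choices, not the lecture's):

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
x = rng.normal(1.0, 2.0, size=3000)
y = 0.6 * x + rng.normal(0.0, 1.0, size=3000)

# Step 1: estimate the margins (here simply Gaussian margins by ML)
mu_x, sd_x = norm.fit(x)
mu_y, sd_y = norm.fit(y)
u = norm.cdf(x, mu_x, sd_x)          # probability-integral transforms
v = norm.cdf(y, mu_y, sd_y)

# Step 2: estimate the copula parameter conditional on step 1 (Clayton log-density)
def neg_copula_loglik(theta):
    c = (1 + theta) * (u * v) ** (-theta - 1) * \
        (u ** -theta + v ** -theta - 1.0) ** (-2.0 - 1.0 / theta)
    return -np.sum(np.log(c))

res = minimize_scalar(neg_copula_loglik, bounds=(0.01, 20.0), method="bounded")
print("IFM estimate of theta:", res.x)
```
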

35
Q

What are the semi-parametric ML’s properties ?

A
  • Avoids specifying the margins → uses the marginal empirical cdfs instead
  • The copula estimator is obtained by maximizing the pseudo-likelihood
  • It is asymptotically normal, with a larger asymptotic variance than the MLE (which is obtained assuming the margins are known)
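
A hedged sketch of the semi-parametric variant: the parametric margins of the previous sketch are replaced by rescaled ranks (pseudo-observations), and the copula family (Clayton) is again an illustrative assumption:

```python
import numpy as np
from scipy.stats import rankdata
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
x = rng.normal(size=3000)
y = 0.6 * x + rng.normal(size=3000)

# Pseudo-observations: rescaled ranks replace any parametric margin
n = len(x)
u = rankdata(x) / (n + 1)
v = rankdata(y) / (n + 1)

# Maximize the pseudo-likelihood of an assumed Clayton copula over theta
def neg_pseudo_loglik(theta):
    c = (1 + theta) * (u * v) ** (-theta - 1) * \
        (u ** -theta + v ** -theta - 1.0) ** (-2.0 - 1.0 / theta)
    return -np.sum(np.log(c))

theta_hat = minimize_scalar(neg_pseudo_loglik, bounds=(0.01, 20.0), method="bounded").x
print("semi-parametric estimate of theta:", theta_hat)
```
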
36
Q

What is convenient with the Method of moments while estimating copula ?

A
  • Simpler estimator of the copula’s parameters
  • Equate theoretical quantities with their empirical counterparts
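
A minimal sketch under the assumption that the moment condition matches Kendall's tau: for a Clayton copula, τ = θ/(θ+2) inverts to θ = 2τ/(1-τ) (the data-generating process below is illustrative):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(8)
x = rng.normal(size=2000)
y = 0.6 * x + rng.normal(size=2000)

# Match the theoretical Kendall's tau of a Clayton copula, tau = theta/(theta+2),
# to its sample counterpart and solve for theta: theta = 2*tau/(1 - tau)
tau_hat, _ = kendalltau(x, y)
theta_hat = 2.0 * tau_hat / (1.0 - tau_hat)
print("moment estimate of theta:", theta_hat)
```
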