Quantifying Volatility in VaR Models Flashcards

1
Q

What is VaR based on?

A

The underlying variable’s probability distribution

2
Q

What are the characteristics of a fat-tailed distribution?

A

Compared with a normal distribution that has the same mean and standard deviation, a fat-tailed distribution has more probability mass around the mean and in the tails, and less probability mass in the intermediate range around +1/-1 standard deviation. In effect, probability mass is shifted from the +1/-1 SD region toward the mean and the tails, so the overall mean and SD remain the same as those of the normal distribution.

3
Q

Reasons for fat tails

A

The normal distribution assumes returns are unconditionally distributed, i.e., the distribution is independent of market conditions and new information. In reality, returns are conditionally distributed. If the distribution is conditional, then its first two moments, the mean and the SD, are also conditional. So the possible reasons for fat tails are: 1. the conditional mean is time varying; 2. the conditional SD (volatility) is time varying.

4
Q

Are fat tails caused by a time-varying mean?

A

Unlikely. Markets are efficient enough to absorb new information quickly, so a predictably time-varying conditional mean is implausible.

5
Q

Are fat tails caused by time-varying volatility?

A

Likely. Volatility is time varying: uncertainty differs across points in time (for example around major central bank decisions), so the conditional volatility changes.

6
Q

What is the regime-switching volatility model?

A

It assumes that different market regimes exist in different time periods, each with either high or low volatility but never in between. Within each regime the conditional distribution is normal, but the volatility is time varying as the regime switches.

7
Q

Implications of the regime-switching volatility model

A

Under the unconditional distribution, which is based on a single constant volatility, extreme events appear unlikely. If returns are actually conditionally distributed with regime-dependent volatility, the unconditional distribution underestimates the probability of extreme events, because the probability of an extreme move under the high-volatility conditional distribution is higher than under the unconditional distribution.

8
Q

Methods for estimating VaR

A

(1) Historical-based approaches: the parametric approach, the non-parametric approach, and the hybrid approach. (2) The implied-volatility approach.

Parametric approach: requires an assumption about the underlying distribution, i.e., if returns follow a random walk, the future variance equals: …. (so the parameters of the distribution have to be defined and estimated).

Non-parametric approach: no assumption about the underlying distribution.
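As a rough sketch of what that parametric formula typically looks like, assuming zero-mean returns and an equally weighted window of K observations:

\sigma_t^2 = \frac{1}{K}\sum_{i=1}^{K} r_{t-i}^2

i.e., once the distribution is specified, estimating VaR reduces to estimating its parameters.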

9
Q

Parametric approach: how is volatility determined under exponential smoothing?

A

The weights on past data decline exponentially as the observations get older; the most recent observation receives the highest weight.
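A minimal sketch of the estimator, assuming the usual notation with decay factor \lambda (e.g. 0.94):

\sigma_t^2 = \lambda\,\sigma_{t-1}^2 + (1-\lambda)\,r_{t-1}^2, \qquad 0 < \lambda < 1

which is equivalent to placing a weight proportional to \lambda^{i-1} on the squared return from i periods ago, so the most recent observation gets the highest weight.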

10
Q

How is the residual weight handled in the exponential smoothing method?

A
11
Q

What is the difference between the exponential weighting method and the standard (equally weighted) method?

A

Exponential weighting places more weight on recent observations, while the standard method places equal weight on every observation in the window.

If the sample size is small, each equal weight in the standard method becomes larger, so an extreme observation has a greater impact on the estimate regardless of where in the window it lies.
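For concreteness, with a window of K observations and a decay factor \lambda (notation assumed, i = 1 being the most recent observation), the two weighting schemes are:

w_i^{\text{equal}} = \frac{1}{K}, \qquad w_i^{\text{exponential}} = \frac{(1-\lambda)\,\lambda^{i-1}}{1-\lambda^{K}}

so shrinking the window K raises every equal weight, whereas the exponential weights depend mainly on how recent the observation is.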

12
Q

Adaptive volatility estimation: what is the meaning of the weight?

A

A higher weight on last period's volatility estimate means our belief does not change dramatically from last period's volatility even when new information arrives.
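As a sketch, writing the weight on last period's estimate as \lambda (notation assumed), the adaptive update can be expressed as

\hat\sigma_{t+1}^2 = \hat\sigma_t^2 + (1-\lambda)\,(r_t^2 - \hat\sigma_t^2)

which is algebraically the same as exponential smoothing: the higher \lambda is, the less the estimate moves in response to the surprise term r_t^2 - \hat\sigma_t^2.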

13
Q

Why is GARCH in general better than the exponential smoothing method?

A

Because GARCH is more general and less restrictive; exponential smoothing is a restricted special case of GARCH(1,1).
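For reference (a standard result, with the usual notation assumed), the GARCH(1,1) recursion is

\sigma_t^2 = \omega + \alpha\,r_{t-1}^2 + \beta\,\sigma_{t-1}^2

and exponential smoothing is the restricted special case \omega = 0, \alpha = 1-\lambda, \beta = \lambda (so \alpha + \beta = 1), which is why GARCH is the more general model.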

14
Q

What are the three common non-parametric methods for estimating VaR?

A
  1. Historical simulation
  2. Multivariate density estimation (MDE)
  3. The hybrid approach
15
Q

Advantages of the non-parametric method over the parametric method for estimating VaR

A
  1. It does not require an assumption about the underlying distribution.
  2. Therefore fat tails and skewness are not a problem in the estimation process (since the empirical distribution is used as the true distribution).
  3. The MDE approach allows weights to vary based on how relevant the data are to the current market environment, regardless of the timing of the data.
  4. The hybrid approach does not require distribution assumptions because it uses a historical simulation approach with an exponential weighting scheme.
16
Q

Disadvantages of the non-parametric method

A
17
Q

Historical simulation example

The 6 lowest returns from a 100-day window are provided; each return is equally weighted at 1/100.

A
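A minimal sketch of how a 95% historical-simulation VaR would be read off such a window; the returns and the 5% cutoff below are assumptions for illustration:

    import numpy as np

    # Hypothetical daily returns for a 100-day window (illustration only).
    rng = np.random.default_rng(0)
    returns = rng.normal(0.0, 0.01, 100)

    # Each observation carries an equal weight of 1/100, so the 95% VaR is the
    # loss at the 5th percentile of the empirical return distribution
    # (roughly the 5th-lowest return in the window).
    var_95 = -np.percentile(returns, 5)
    print(f"95% one-day historical-simulation VaR: {var_95:.4%}")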
18
Q

Procedure for the Hybrid approach

A
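A minimal sketch of the usual hybrid procedure (historical simulation with exponentially declining weights); the decay factor lam, the window length, and the return history below are assumptions for illustration:

    import numpy as np

    def hybrid_var(returns, lam=0.98, alpha=0.05):
        """Hybrid VaR: historical simulation with exponentially declining weights."""
        K = len(returns)
        # Weight on the return from i days ago (i = 1 is the most recent),
        # rescaled so the K weights sum to one.
        ages = np.arange(1, K + 1)
        weights = (1 - lam) * lam ** (ages - 1) / (1 - lam ** K)
        weights = weights[::-1]                # `returns` is ordered oldest to newest
        # Sort returns from worst to best and accumulate weights until the
        # alpha quantile is reached (simple cutoff, no interpolation).
        order = np.argsort(returns)
        cumulative = np.cumsum(weights[order])
        cutoff_return = returns[order][np.searchsorted(cumulative, alpha)]
        return -cutoff_return

    rng = np.random.default_rng(1)
    history = rng.normal(0.0, 0.01, 250)       # hypothetical return history
    print(f"95% hybrid VaR: {hybrid_var(history):.4%}")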
22
Q

What is multivariate density estimation (MDE)?

A

It is a method used to estimate the joint probability density function of a set of variables. In the VaR context, each past observation is weighted by how similar the market conditions at that time were to current conditions.

23
Q

What is return aggregation?

A

The portfolio return for each past day is computed using today's asset weights, regardless of when the returns occurred (so the portfolio return k days ago is calculated with today's weights).

Some markets that normally appear weakly correlated become highly correlated in down markets. A variance-covariance matrix estimated from normal-market data can therefore underestimate the true risk in a down market, and the benefit of diversification can disappear.

The benefit of return aggregation is that no covariance estimation is required (there is no need to estimate the portfolio's variances and correlations as the covariance-matrix approach does), which avoids estimation error and the problem of correlations rising in down markets.

A third approach to calculating VaR estimates the volatility of the vector of aggregated returns and assumes normality based on the strong law of large numbers.
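A minimal sketch of the aggregation step, using a hypothetical return history and hypothetical current weights:

    import numpy as np

    # Hypothetical history of daily returns: rows are days, columns are assets.
    rng = np.random.default_rng(2)
    asset_returns = rng.normal(0.0, 0.01, size=(250, 3))

    # Today's portfolio weights are applied to every past day, regardless of
    # what the actual weights were back then.
    weights_today = np.array([0.5, 0.3, 0.2])
    portfolio_returns = asset_returns @ weights_today

    # VaR is then estimated directly on the aggregated series (for example by
    # historical simulation), so no covariance matrix has to be estimated.
    var_95 = -np.percentile(portfolio_returns, 5)
    print(f"95% VaR of the aggregated portfolio return series: {var_95:.4%}")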

24
Q

Advantages and disadvantages of using implied volatility

A

A big advantage of implied volatility is its forward-looking, predictive nature. Forecast models based on historical data require time to adjust to market events, whereas implied volatility reacts immediately to changing market conditions.

The major disadvantage is that implied volatility is model dependent. The Black-Scholes model assumes that asset returns are lognormally distributed and that volatility remains constant over the life of the option.

Empirical evidence suggests implied volatility overestimates true volatility, and implied volatility is only available for asset classes that have options traded in the market.

25
Q

How is long-horizon volatility estimated, and what are its assumptions?

A

A1: The underlying series is a random walk, so knowledge of today's return has no effect on tomorrow's return (zero serial correlation).

A2: Volatility is constant over the horizon.
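Under these two assumptions the square-root-of-time rule follows; for a J-period horizon (notation assumed):

\sigma_{J\text{-period}} = \sqrt{J}\,\sigma_{1\text{-period}}, \qquad \text{VaR}_{J\text{-period}} = \sqrt{J}\,\text{VaR}_{1\text{-period}}

because the variances of uncorrelated period returns simply add.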

26
Q

What is mean reversion? How does it affect long-horizon volatility?

A

A mean-reverting series is a stationary process with a long-run mean; the series tends to revert toward that long-run level.

If that is the case, the autocorrelation of the series (its correlation with its lagged values) is no longer zero, because after a period of increases a decline becomes more likely. The serial covariance is negative when mean reversion exists, so the zero-serial-covariance assumption (A1) leads to overestimating the true long-horizon volatility.

So if mean reversion exists, the true long-horizon volatility is lower than what the square-root-of-time rule implies.

27
Q

How does time-varying volatility affect long-horizon volatility?

A

Volatility itself is stochastic and has a long-run mean (a steady-state level of uncertainty).

If today's volatility is above the long-run mean, a decline in volatility is likely, so the autocorrelation is expected to be negative; the zero-serial-covariance assumption then overestimates the true long-horizon volatility (the square-root approach overestimates).

If today's volatility is below the long-run mean, an increase in volatility is likely, the autocorrelation is expected to be positive, and the zero-serial-covariance assumption underestimates the true long-horizon volatility (the square-root approach underestimates).

Whether the rule over- or underestimates depends on where today's volatility sits relative to its long-run mean.

28
Q

Formalize the mean reversion process

A
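One standard way to formalize a mean-reverting return series is a first-order autoregression (a sketch with assumed notation):

r_t = a + b\,r_{t-1} + \varepsilon_t, \qquad |b| < 1

with long-run mean a/(1-b), toward which the series is pulled back. The first-order serial covariance is b\,\sigma^2, so the negative serial covariance described in the previous card corresponds to b < 0.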
29
Q

Formalize why the square-root-of-time rule overestimates risk when mean reversion exists

A

The square-root-of-time method simply multiplies today's volatility by the square root of the horizon, √t, to obtain the t-period-ahead volatility; this is only valid when the serial covariance between period returns is zero.
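A minimal formalization for a two-period horizon (notation assumed):

\text{Var}(r_{t+1} + r_{t+2}) = 2\sigma^2 + 2\,\text{Cov}(r_{t+1}, r_{t+2})

The square-root rule sets the covariance term to zero and uses \sqrt{2}\,\sigma; with mean reversion the serial covariance is negative, so the true two-period volatility is below \sqrt{2}\,\sigma and the rule overestimates the risk.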

30
Q

What is backtesting of the VaR model?

A

Backtesting is the process of comparing losses predicted by the value at risk (VaR) model to those actually experienced over the sample testing period.

If a model were completely accurate, we would expect VaR to be exceeded (this is called an exception) with the same frequency predicted by the confidence level used in the VaR model.

For example, if the one-day VaR is 50,000 at the 95% confidence level, we would expect the observed loss to exceed 50,000 on roughly 12 to 13 days out of 250 trading days (250 × 5% = 12.5).

31
Q

How to evaluate the effectiveness of VaR?

A
  1. VaR should be unbiased

To test this property, we use an indicator variable to record whether an exception occurs on each day in the sample period: the indicator is 1 for an exception and 0 otherwise. The average of the indicator variables over the sample period should equal x% (the significance level) for the VaR estimate to be unbiased.
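A minimal sketch of this unbiasedness check, using hypothetical realized returns and a hypothetical VaR series:

    import numpy as np

    def exception_rate(returns, var_estimates):
        """Average of the exception indicators (1 if the loss exceeds VaR, else 0)."""
        exceptions = (np.asarray(returns) < -np.asarray(var_estimates)).astype(int)
        return exceptions.mean()

    rng = np.random.default_rng(3)
    realized = rng.normal(0.0, 0.01, 250)     # hypothetical realized daily returns
    var_95 = np.full(250, 0.0165)             # hypothetical constant 95% daily VaR
    # For an unbiased 95% VaR, the average indicator should be close to 5%.
    print(f"Observed exception rate: {exception_rate(realized, var_95):.2%}")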

32
Q

How to evaluate the effectiveness of VaR?

A
  2. VaR should be adaptable

If a large return increases the size of the tail of the return distribution, the VaR amount should also be increased. Given a large loss, VaR must be adjusted so that the probability of the next large loss again equals x% (the VaR amount must adjust to maintain the same confidence level). This implies that the indicator variables discussed previously should be independent of each other; the VaR estimate must account for new information in the face of increasing volatility.

33
Q

How to evaluate the effectiveness of VaR?

A
  3. VaR should be robust

A strong VaR estimate produces only a small deviation between the number of expected exceptions during the sample period and the actual number of exceptions.

This attribute is measured by examining the statistical significance of the autocorrelation of extreme events over the backtesting period.

A statistically significant autocorrelation would indicate a less reliable VaR measure.

34
Q

The effectiveness of different methods for VaR

A

The non-parametric methods perform better than the parametric methods because fat-tail events are more easily incorporated in the historical simulation approach.

A higher weight (decay factor) in the exponential weighting scheme does a better job than a lower weight.

The hybrid approach tends to reject the null hypothesis of zero autocorrelation in tail events fewer times than the exponential smoothing approach when autocorrelation in tail events exists.
