Bahnemann Flashcards

1
Q

Methods for Estimating Distribution Parameters

A

method of moments

maximum likelihood

minimum chi squared

minimum distance
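
Not from the source: a minimal sketch (hypothetical parameters, simulated data) contrasting the first two methods on a lognormal severity.

```python
# Minimal sketch: method-of-moments vs maximum-likelihood estimates of
# lognormal severity parameters from the same (simulated) claim sample.
import numpy as np

rng = np.random.default_rng(1)
x = rng.lognormal(mean=8.0, sigma=1.2, size=5_000)   # hypothetical claims

# Method of moments: match the sample mean and variance to the lognormal moments.
m, v = x.mean(), x.var()
sigma2_mom = np.log(1 + v / m**2)
mu_mom = np.log(m) - sigma2_mom / 2

# Maximum likelihood: for the lognormal, the MLEs are the moments of log(x).
logx = np.log(x)
mu_mle, sigma2_mle = logx.mean(), logx.var()

print(f"MoM: mu = {mu_mom:.3f}, sigma = {np.sqrt(sigma2_mom):.3f}")
print(f"MLE: mu = {mu_mle:.3f}, sigma = {np.sqrt(sigma2_mle):.3f}")
```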

2
Q

truncation

A

discarding; usually in the case of claims below a deductible

3
Q

censoring

A

capping; usually in the case of a policy limit

4
Q

shifting

A

usually with a straight deductible; claims larger than the deductible are reduced by the deductible amount

5
Q

in case of deductibles

A

usually the limit l is reduced by the deductible a, so the layer of interest becomes (a, l]

6
Q

in case of underlying limits

A

the limit l is not reduced by the underlying limit a, so the layer of interest becomes (a, a+l]
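
A hypothetical numeric illustration of the two conventions: with a = 25,000 and l = 100,000, a deductible gives the layer (25,000, 100,000] (maximum insurer payment 75,000), whereas an underlying limit gives the layer (25,000, 125,000] (maximum insurer payment 100,000).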

7
Q

since limits reduce volatility of severity compared to unlimited data

A
  • may be interesting to compute the variability of losses in a layer
  • can use the coefficient of variation to measure this variability for different distributions (see the sketch below)
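
A minimal sketch of the second bullet (not from the source; the severity distribution and layers are hypothetical), using E[Y] = ∫ S(x) dx and E[Y²] = 2∫ (x − a) S(x) dx over the layer.

```python
# Minimal sketch: coefficient of variation of losses in a layer (a, a+l]
# for an assumed lognormal severity, via its survival function.
import numpy as np
from scipy import stats
from scipy.integrate import quad

sev = stats.lognorm(s=1.5, scale=np.exp(9))       # hypothetical severity X

def layer_moments(a, l):
    """First two moments of Y = min(X, a+l) - min(X, a) per ground-up claim."""
    S = sev.sf
    m1, _ = quad(S, a, a + l)                              # E[Y]   = integral of S(x)
    m2, _ = quad(lambda x: 2 * (x - a) * S(x), a, a + l)   # E[Y^2]
    return m1, m2

for a, l in [(0, 100_000), (0, 1_000_000), (1_000_000, 1_000_000)]:
    m1, m2 = layer_moments(a, l)
    cv = np.sqrt(m2 - m1**2) / m1                  # coefficient of variation
    print(f"layer ({a:,}, {a + l:,}]: CV = {cv:.2f}")
```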
8
Q

claim contagion parameter

A

accounts for claim counts not being independent of each other (e.g., one claim encourages others to file claims too)

-if claim counts follow a Poisson distribution, γ = 0
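
For reference, a common parameterization (my restatement, not a quote from the text): with expected claim count n and contagion parameter γ,

```latex
\operatorname{Var}(N) = n(1 + \gamma n), \qquad \gamma = 0 \ (\text{Poisson}), \quad \gamma > 0 \ (\text{negative binomial}).
```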

9
Q

final rate for a policy needs to incorporate

A

all other expenses and profit, as well as a charge for risk

10
Q

risk charge

A

premium amount used to cover contingencies such as:

  1. random deviations of losses from expected values (process risk)
  2. uncertainty in selection of parameters describing the loss process (parameter risk)
11
Q

Loss cost multiplier

A
  • LCM loads all costs on top of PP to get final rate
  • commonly used in practice for WC to load an insurer’s own expenses and profit on top of PP
  • the underlying loss costs (PP) are published by a bureau such as NCCI
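
A hypothetical numeric example (my numbers): assuming all expenses are variable, one common form is LCM = 1/(1 − v); with loss cost PP = 2.00 per $100 of payroll and a variable expense plus profit provision of v = 30%, LCM = 1/0.70 ≈ 1.429 and the final rate ≈ 2.00 × 1.429 ≈ 2.86.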
12
Q

for purpose of pricing, instead of publishing full rates for every limit

A

insurers usually publish rates for a basic limit and apply relativities called ILFs (increased limit factors) to rate higher limits

13
Q

ILFs can be determined using

A

empirical data directly, or a theoretical curve fit to the empirical data; the latter approach is more common for the highest limits, where there is little empirical loss data
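
A minimal sketch of the curve-fit approach (hypothetical fitted parameters and limits), using I(l) = E[X ∧ l] / E[X ∧ b] under the usual ILF assumptions.

```python
# Minimal sketch: ILFs from a fitted severity curve as ratios of limited
# expected values, I(l) = E[X ∧ l] / E[X ∧ b].
import numpy as np
from scipy import stats
from scipy.integrate import quad

sev = stats.lognorm(s=2.0, scale=np.exp(9))   # assumed fitted severity distribution
basic = 100_000                               # basic limit b

def lev(limit):
    """Limited expected value E[X ∧ limit] = integral of the survival function."""
    value, _ = quad(sev.sf, 0, limit)
    return value

for l in [100_000, 250_000, 500_000, 1_000_000]:
    print(f"I({l:>9,}) = {lev(l) / lev(basic):.3f}")
```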

14
Q

in determining ILFs appropriate for each limit, following assumptions are commonly made:

A
  1. all UW expenses and profit are variable and don’t vary by limit
    - in practice, profit loads might be higher for higher limits since they are more volatile
  2. frequency and severity are independent
  3. frequency is same for all limits
15
Q

ILFs must be (with Bahnemann's assumption that f_X(l) ≠ 0)

A

increasing at a decreasing rate
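
With I(l) = E[X ∧ l] / E[X ∧ b] and E[X ∧ l] = ∫₀ˡ S_X(x) dx, the derivatives behind this statement are (standard result, restated here):

```latex
I'(l) = \frac{S_X(l)}{\mathrm{E}[X \wedge b]} \ge 0,
\qquad
I''(l) = \frac{-f_X(l)}{\mathrm{E}[X \wedge b]} \le 0,
```

with the second inequality strict when f_X(l) ≠ 0, so I(l) increases at a decreasing rate.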

16
Q

in terms of premium, premium for successive layers of coverage of constant width

A

will be decreasing

17
Q

checking that a set of ILFs satisfies the above criteria for I'(l) and I''(l)

A

performing a consistency test:

for each per-occurrence limit l, tabulate the increased limit factor I(l) and the marginal rate per $1,000 of coverage (an estimate of I'(l))
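
A minimal sketch of the test (the ILF values are hypothetical): compute the marginal rate per $1,000 of additional coverage between successive limits and check that it declines as the limit increases.

```python
# Minimal sketch: consistency test on a set of (hypothetical) ILFs.
limits = [100_000, 250_000, 500_000, 1_000_000]   # per-occurrence limits l
ilfs   = [1.000,   1.400,   1.700,   1.950]       # assumed I(l)

prev = float("inf")
for (l0, i0), (l1, i1) in zip(zip(limits, ilfs), zip(limits[1:], ilfs[1:])):
    marginal = (i1 - i0) / ((l1 - l0) / 1_000)    # change in ILF per $1k of limit
    status = "consistent" if marginal <= prev else "FAILS consistency"
    print(f"({l0:,}, {l1:,}]: {marginal:.5f}  {status}")
    prev = marginal
```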

18
Q

an exception to consistency test

A

could occur if one of the ILF assumptions is violated; e.g., if liability lawsuits were influenced by the size of the limit, then frequency would not be the same for all limits and the formulas would not hold

19
Q

there is more volatility (process risk) for policies with

A

higher limits or higher attachment points; insurers will also want to charge a risk load for these policies

-to do this, a risk charge ρ(l) needs to be included

20
Q

risk charge ρ(l) options

A

old Miccolis method (aka the variance method)

old ISO method (aka the standard deviation method)
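
In generic form (my summary of the method names, not the exact formulas from the text):

```latex
\rho(l) \propto \operatorname{Var}[\text{losses limited to } l]
\quad \text{(variance method)}, \qquad
\rho(l) \propto \operatorname{SD}[\text{losses limited to } l]
\quad \text{(standard deviation method)}.
```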

21
Q

when pricing a layer of coverage and applying a risk load, it is not appropriate to

A

simply subtract risk-loaded ILFs or premiums for the layer, since that would result in an incorrect risk load

while without risk loading we have

Pa,l = Pa+l − Pa = Pb·[I(a+l) − I(a)]

-these relationships do not hold when risk loading is applied
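
Restating the unloaded relationship in display form:

```latex
P_{a,l} \;=\; P_{a+l} - P_{a} \;=\; P_{B}\,\bigl[I(a+l) - I(a)\bigr] \qquad \text{(no risk load)},
```

whereas the risk-loaded layer premium needs a charge reflecting the variability of losses in the layer itself, which in general is not ρ(a+l) − ρ(a), so it cannot be obtained by simple subtraction.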

22
Q

deductibles typically reduce

A

the coverage limit, so the layer of coverage with deductible d and limit l is (d, l] and not (d, d+l]

23
Q

3 types of deductibles:

A

straight

franchise

diminishing

24
Q

straight deductible

A

loss is truncated and shifted by d such that net losses

Xd = X − d for d < X

25
Q

franchise deductible

A

loss is truncated but not shifted by d such that

Xd = X for d < X

26
Q

diminishing deductible

A
  • aka disappearing deductible
  • loss below amount d is fully eliminated, the deductible declines linearly from d to a larger amount D, and loss above D is paid in full

Xd,D = [D/(D − d)]·(X − d) for d < X ≤ D

Xd,D = X for D < X

-the formula for the LER, even assuming ALAE is not additive, is long, so calculate the loss eliminated at each size-of-loss level as a % of total losses (see the sketch below)
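
A minimal simulation sketch of the LER point above and of the comparison in the next card (severity and deductible values are hypothetical):

```python
# Minimal sketch: simulated loss elimination ratios (LER) for straight,
# franchise, and diminishing deductibles on the same ground-up losses.
import numpy as np

rng = np.random.default_rng(0)
x = rng.lognormal(mean=8.0, sigma=1.3, size=1_000_000)   # hypothetical losses
d, D = 1_000, 5_000                                      # deductible d, disappearing point D

paid_straight  = np.maximum(x - d, 0)
paid_franchise = np.where(x > d, x, 0)
paid_diminish  = np.where(x > D, x, np.where(x > d, D / (D - d) * (x - d), 0))

total = x.sum()
for name, paid in [("straight", paid_straight),
                   ("franchise", paid_franchise),
                   ("diminishing", paid_diminish)]:
    print(f"{name:>11}: LER = {1 - paid.sum() / total:.3f}")
```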

27
Q

when comparing the 3 deductibles

A

straight eliminates the most loss, then diminishing and then franchise

28
Q

Assuming retention R describe how inflation will affect the expected losses for the excess cover relative to unlimited ground up losses.

A

Inflation would affect excess losses more than total losses. Some losses that were below the retention will now get into the excess layer. Also, for losses that were already in the excess layer, the increase in losses due to inflation will affect the excess layer entirely.
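
A hypothetical worked example: with retention R = 100,000 and a 10% trend, a 95,000 loss (previously producing no excess loss) becomes 104,500 and now contributes 4,500 to the excess layer, while a 150,000 loss's excess portion grows from 50,000 to 65,000, a 30% increase versus the 10% ground-up trend.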

29
Q

reason why it is desirable for a set of increased limits factors to pass this consistency test.

A

So long as partial losses are possible, expected losses will not increase as much as the increase in limits, so the rate per $1k of coverage should decrease as the limit increases.

30
Q

one reason that a set of increased limits factors may fail this consistency test yet still generate actuarially reasonable prices.

A

Adverse selection, which could happen if insureds that expect higher loss potential are more inclined to buy higher limits.

Alternatively, liability lawsuits could be influenced by the size of the limit, so frequency would not be the same for all limits.

31
Q

how the consistency test has both a mathematical interpretation and a practical meaning

A

The practical interpretation is that as the limit increases, fewer losses are expected in the higher layers, so rates should not increase more for higher limits than for lower limits.

The mathematical interpretation is that I'(l) ≥ 0 and I''(l) ≤ 0.

32
Q

diagram for expected losses

A

y-axis = loss size x

x-axis = F_X(x), the cumulative loss distribution
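
One way to read the diagram (my gloss on this standard picture): areas represent expected losses, e.g.

```latex
\mathrm{E}[X \wedge l] = \int_0^l S_X(x)\,dx
```

is the area between the curve F_X(x) and the right edge F = 1 below height l, and the expected losses in a layer (a, b] are the part of that area between heights a and b.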

33
Q

why the loss cost for a given straight deductible policy can increase more than the ground-up severity trend.

A

For losses above the deductible, the trend is entirely in the excess layer.

Also, losses just under the deductible are pushed into the excess layer by the trend, creating new losses for the excess layer.

34
Q

risk load increases as

A

policy limit increases and is used to take into account the higher process risk for policies with higher limits