special classifications Flashcards
what RV is considered one of the main factors in the freq and severity of claims
- geographic location: almost every rating algorithm has some RV that reflects the geographic location of the risk
Few challenges when determining indicated rates for territories
- territory tends to be highly correlated with other RVs
- territories are often such small areas that the data in each territory may have very limited credibility
2 steps in territorial RM
- establishing territorial bounds
- determining indicated rates for each territory
Establishing territorial boundaries
- define basic geographic unit
- estimate the geographic systematic risk for each geographic unit, distinguishing it from both random noise and the systematic risk of other correlated non-geographic RVs
GLMs can incorporate
- GLMs can incorporate geo-physical variables (e.g., rainfall) and geo-demographic variables (e.g., population density)
- some geographic variation remains unexplained -> add a new variable to account for the residual variation
spatial smoothing techniques
can be applied to residual variable to smooth results
- distance-based: each unit's estimate is credibility-weighted with data from other units, with weights diminishing with distance
- adjacency-based: each unit's estimate is credibility-weighted with data from rings of surrounding units, with weights diminishing for wider rings
distance based
- easy to understand and implement, but assumes distance has the same impact for urban and rural risks and doesn't consider physical boundaries
- best for weather-related perils
adjacency based
- better reflects urban and rural differences and accounts for physical boundaries better
- best for socio-demographic perils (theft)
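A minimal sketch of distance-based smoothing, assuming made-up unit centroids, residual relativities, and claim counts; the exponential decay weight is an illustrative choice, not a prescribed formula:

```python
import math

# Hypothetical geographic units: (x, y) centroid, raw residual relativity,
# and claim volume used as the credibility base. All values are invented.
units = {
    "A": {"xy": (0.0, 0.0), "resid": 1.20, "claims": 500},
    "B": {"xy": (1.0, 0.0), "resid": 0.90, "claims": 50},
    "C": {"xy": (0.0, 1.0), "resid": 1.05, "claims": 200},
    "D": {"xy": (5.0, 5.0), "resid": 0.80, "claims": 10},
}

def smooth(target, units, decay=1.0):
    """Distance-based smoothing: credibility-weight every unit's residual,
    with weights that diminish exponentially with distance from target."""
    tx, ty = units[target]["xy"]
    num = den = 0.0
    for u in units.values():
        d = math.dist((tx, ty), u["xy"])
        w = u["claims"] * math.exp(-decay * d)  # volume x distance decay
        num += w * u["resid"]
        den += w
    return num / den

# Thinly populated, distant unit D gets pulled toward its neighbors,
# while high-volume unit A stays close to its own raw relativity.
print(round(smooth("D", units), 3))
print(round(smooth("A", units), 3))
```

The decay parameter controls how fast credibility falls off; a larger value keeps each unit closer to its own raw estimate.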
clustering routines
- once indicated relativities are determined at the basic geographic unit level, they can be grouped into territories, if desired, using a clustering routine
- quantile methods
- similarity methods
determining correct relativities (aka ILFs) for other limits has become more important over time
- as personal wealth grows, people need more coverage
- inflationary trends have a greater impact on higher limits
- more lawsuits and higher jury awards over time
why standard RM approaches are problematic for ILFs
Generally less data at higher limits so results can be volatile
Analyses can produce results that are impractical to implement
standard ILF approach
-rates for various limits are expressed as relativities/ILFs to rate for basic limit:
rate @ limit H = ILF(H) * rate @ limit B
common assumptions for ILF approach
All UW expenses and profit are variable and don’t vary by limit
Freq and severity are independent
Freq is the same for all limits
LAS(H)
limited average severity at limit H = average severity assuming every loss is capped at H
ILFs with censored losses
have to use LAS for layers of loss
LAS(H) = LAS(B) + LAS(H-B xs B) * Pr(X > B)
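The LAS-based ILF calculation can be sketched as follows, using a made-up sample of ground-up losses and a hypothetical $100,000 basic limit:

```python
# Standard ILF approach sketch: compute limited average severities (LAS)
# from ground-up losses, then express each limit as a relativity to the
# basic limit. The loss sample and limits are purely illustrative.
losses = [5_000, 20_000, 60_000, 150_000, 400_000]

def las(losses, limit):
    """Limited average severity: average loss with each loss capped at limit."""
    return sum(min(x, limit) for x in losses) / len(losses)

basic = 100_000
for limit in (100_000, 250_000, 500_000):
    ilf = las(losses, limit) / las(losses, basic)
    print(f"ILF({limit:>7,}) = {ilf:.3f}")
```

Note the diminishing increase in the ILF as the limit grows: only the largest losses contribute to the higher layers.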
why are deductibles popular
- reduce insured’s prem
- eliminate insurer from handling small nuisance claims
- provide incentive for insured to avoid or mitigate loss
- help reduce insures CAT exposure
Standard method of pricing deductibles
Loss Elimination Ratio approach
typical assumption of loss elimination approach
All UW expenses and profit are variable with prem
ind relativity for deductible pricing
= excess ratio(D) = 1 - LER(D) = 1 - (losses eliminated by deductible D / ground-up losses)
when you dont have ground up losses for deductible pricing
- only policies with deductibles less than or equal to the deductible being priced can be used
- LER = losses eliminated by the new deductible / losses at the old deductible
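A minimal sketch of the LER approach with ground-up losses (all figures are invented for illustration):

```python
# Loss Elimination Ratio sketch: LER(D) is the share of ground-up losses
# eliminated by deductible D; the indicated deductible relativity is the
# excess ratio, 1 - LER(D). The loss sample is made up.
losses = [100, 300, 500, 1_000, 5_000]

def ler(losses, d):
    """Fraction of ground-up losses eliminated by a deductible of d."""
    eliminated = sum(min(x, d) for x in losses)
    return eliminated / sum(losses)

d = 500
relativity = 1 - ler(losses, d)
print(f"LER({d}) = {ler(losses, d):.3f}, indicated relativity = {relativity:.3f}")
```

Applying the relativity to the full-coverage rate gives the rate for the deductible policy, under the assumption that all expenses and profit vary with premium.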
other issues the LER approach will not consider
- claimant behavior may vary by deductible
- favorable selection occurs if low-risk insureds choose higher deductibles -> LER will not recognize this
- the dollar savings implied by the deductible relativity may not be appropriate -> prem savings can exceed the deductible difference
WC Size of Risk
- rating algorithms include components recognizing that fixed expenses as a % of prem should decrease as policy size increases
- helps ensure small risks are not undercharged (and large risks not overcharged) for fixed expenses
expense constant
flat $ amount added to prem for each risk
premium discount
applies % discount to larger policies to recognize that expenses are lower % of their premium
expense reduction = expense ratio of the first (smallest) premium band - expense ratio of the band in question
discount % = expense reduction / (1 - taxes - profit)
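The premium-discount calculation can be sketched with hypothetical expense ratios by premium band (band boundaries, ratios, and the tax/profit loads are all invented):

```python
# Premium discount sketch: the first band's expense ratio is the one built
# into base rates; larger bands earn a discount reflecting their lower
# expense ratio, grossed up for variable taxes and profit. Figures invented.
bands = [
    ("first $5,000",  0.110),
    ("next $95,000",  0.090),
    ("next $400,000", 0.075),
    ("over $500,000", 0.065),
]
taxes, profit = 0.03, 0.05

base_ratio = bands[0][1]
for name, ratio in bands:
    reduction = base_ratio - ratio               # expense reduction vs first band
    discount = reduction / (1 - taxes - profit)  # gross up for taxes and profit
    print(f"{name:>14}: discount = {discount:.3%}")
```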
Small WC insureds generally have worse loss experience
- small risks usually have less sophisticated safety programs
- small risks usually don't have return-to-work programs for injured workers
- small risks are often not impacted by (or do not qualify for) experience rating, so they have less incentive to prevent or mitigate injuries
loss constant
- add flat $ amount to premium to equalize LRs between small and large risks
- loss constants can apply only to small risks or can be applied to all risks
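A loss constant can be derived by solving for the flat dollar amount that equalizes loss ratios between small and large risks; a sketch with made-up figures:

```python
# Loss constant sketch: find the flat amount added to each small-risk
# premium so the small-risk loss ratio matches the large-risk loss ratio.
# All premium, loss, and policy-count figures are invented.
small = {"policies": 1_000, "premium": 500_000, "losses": 400_000}  # LR = 0.80
large = {"premium": 5_000_000, "losses": 3_500_000}                 # LR = 0.70

target_lr = large["losses"] / large["premium"]
# Solve losses / (premium + n * LC) = target_lr for the loss constant LC:
needed_prem = small["losses"] / target_lr
loss_constant = (needed_prem - small["premium"]) / small["policies"]
print(f"loss constant = ${loss_constant:.2f} per policy")
```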
insurance to value
ITV = ratio of coverage to replacement cost
- properties are typically insured for the amount it would take to replace the property if damaged
- sometimes properties are not insured to full replacement cost
indicated rate per $1k of coverage
will be higher for underinsured homes as long as partial losses are possible
When properties are insured to less than full replacement, 2 issues:
- the insured will not be fully covered in the event of a total or near-total loss
- if the insurer assumes all homes are insured to full replacement cost when calculating rates, the prem charged for underinsured homes will not be adequate to cover their expected losses -> when prem does not equal expected costs, rates are not equitable
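A toy numeric example of why the indicated rate per $1,000 of coverage is higher for underinsured homes; the discrete loss distribution is invented for illustration:

```python
# With partial losses possible, a fully insured home spreads nearly the
# same expected loss over more units of coverage, so its rate per $1,000
# is lower. Toy loss distribution as (probability, loss) pairs.
replacement_cost = 200_000
loss_dist = [(0.04, 10_000), (0.01, 50_000), (0.001, 200_000)]

def rate_per_1000(coverage):
    # Insurer pays each loss capped at the coverage amount
    expected_paid = sum(p * min(x, coverage) for p, x in loss_dist)
    return expected_paid / (coverage / 1_000)

print(rate_per_1000(200_000))  # insured to full replacement cost
print(rate_per_1000(100_000))  # insured to 50% of replacement cost
```

The underinsured home keeps almost all of the small-loss exposure but buys half as many units of coverage, which is exactly the inequity described above.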
coinsurance
- multiple parties each insure portion of risk
- alter amount of losses that are covered
- can be instituted when property is insured for less than coinsurance requirement set by insurer
coinsurance formulas
a = min[cov / (coins% * RC), 1]
I = min[a * L, cov]
e = min[L, cov] - I
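The coinsurance formulas can be sketched directly; the home value, coverage amount, and 80% coinsurance requirement are illustrative assumptions:

```python
# Coinsurance sketch: a = apportionment ratio, indemnity = insurer payment,
# penalty = reduction in payment for failing the coinsurance requirement.
def coinsurance(loss, coverage, replacement_cost, coins_req=0.80):
    a = min(coverage / (coins_req * replacement_cost), 1.0)
    indemnity = min(a * loss, coverage)
    penalty = min(loss, coverage) - indemnity
    return a, indemnity, penalty

# Home worth 200,000 insured for 120,000 against an 80% requirement
# (requirement = 160,000, so a = 0.75 and partial losses are penalized):
a, I, e = coinsurance(loss=50_000, coverage=120_000, replacement_cost=200_000)
print(a, I, e)
```

Evaluating the same function at larger losses reproduces the graph facts: the penalty peaks when the loss equals the coverage amount and falls to zero once the loss reaches the coinsurance requirement.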
coinsurance graph
- maximum penalty occurs when the loss equals the coverage amount
- penalty is 0 once the loss reaches the coinsurance requirement and beyond