W&M Ch 11 Flashcards
Challenges to territorial ratemaking
1. Location tends to be heavily correlated with other rating variables
   - e.g., high-value homes are often located together
   - makes traditional univariate analysis very susceptible to distortions
2. Territory is often analyzed as a collection of small units
   - data in each individual territory is sparse
Territorial ratemaking generally involves two phases
1. Establishing territorial boundaries
2. Determining rate relativities for the territories
Two major issues when calculating the geographic estimator with univariate techniques
1. Sparse data results in volatile experience
2. Location tends to be highly correlated with other non-geographic factors
What are two spatial smoothing techniques used to improve a unit's estimate using information from nearby units?
1. Distance-based approach
   - Gives weight to nearby geographic units based on their distance from the primary unit
   - Assumption tends to be most appropriate for weather-related perils
2. Adjacency-based approach
   - Weight is given to rings of adjacent units; immediately adjacent units get the most weight
   - Tends to be most appropriate for perils driven by socio-demographic characteristics (e.g., theft)
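The distance-based approach can be sketched in a few lines of Python. The unit coordinates, pure premiums, exponential-decay weighting, and bandwidth below are all illustrative assumptions, not values from the text:

```python
import math

# Hypothetical geographic units: centroid (x, y) and raw pure premium.
units = {
    "A": ((0.0, 0.0), 120.0),
    "B": ((1.0, 0.0), 150.0),
    "C": ((0.0, 1.0), 90.0),
    "D": ((5.0, 5.0), 300.0),
}

def smoothed_estimate(target, units, bandwidth=1.0):
    """Distance-based spatial smoothing: each unit's estimate is
    weighted by exp(-distance / bandwidth) from the target unit,
    so nearby units contribute the most."""
    (tx, ty), _ = units[target]
    num = den = 0.0
    for (x, y), pure_premium in units.values():
        weight = math.exp(-math.hypot(x - tx, y - ty) / bandwidth)
        num += weight * pure_premium
        den += weight
    return num / den

# Unit A's raw estimate (120) is pulled toward its close neighbors B and C;
# the distant unit D has almost no influence.
print(round(smoothed_estimate("A", units), 1))  # → 120.1
```

An adjacency-based version would replace the distance kernel with fixed weights per ring of neighboring units.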
Basic Clustering Routines
- Quantile methods: place an equal number of observations in each group
- Similarity methods: group based on how close the estimators are
- Note: neither naturally produces contiguous groupings; add a contiguity constraint if that is desired
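The quantile method can be sketched as follows (a similarity method would instead split at large gaps between sorted estimators). The unit names and estimator values are hypothetical:

```python
def quantile_clusters(estimates, n_groups):
    """Quantile method: sort units by their estimator and split them
    into groups with (roughly) equal numbers of observations."""
    ordered = sorted(estimates.items(), key=lambda kv: kv[1])
    size, rem = divmod(len(ordered), n_groups)
    groups, start = [], 0
    for g in range(n_groups):
        end = start + size + (1 if g < rem else 0)  # spread any remainder
        groups.append([name for name, _ in ordered[start:end]])
        start = end
    return groups

# Hypothetical smoothed estimators by geographic unit.
estimates = {"A": 0.8, "B": 1.1, "C": 0.9, "D": 1.4, "E": 1.0, "F": 1.3}
print(quantile_clusters(estimates, 3))  # → [['A', 'C'], ['E', 'B'], ['F', 'D']]
```

Note the groups are contiguous in estimator value, not necessarily on the map, which is why a contiguity constraint may be needed.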
Importance of ILFs is growing for several reasons
1. Personal wealth continues to grow
2. Economic inflation drives up costs
3. Social inflation
Why would you choose to vary the profit provision by limit
1. Experience in higher limits can be volatile
2. Losses are less frequent but more severe
3. Greater variability adds uncertainty, making the business riskier and more challenging to price
4. May alter the profit provision to reflect the higher cost of capital needed to support the additional risk
Give an example why frequency may vary by limit chosen
E.g., Personal Auto: a person who chooses a high limit tends to have lower frequency, possibly because an individual choosing a higher limit is more risk averse.
Additional considerations when performing ILF ratemaking
1. Historical losses should be adjusted for expected trend
2. Depending on the age of the data, claims may not be settled
   - Ideally, all claims should be developed to ultimate
3. Losses may be censored from below if the policy has a deductible
   - Can add the deductible back
   - May not be possible to know how many claims were completely eliminated by the deductible
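The trend and add-back-the-deductible adjustments above can be combined into a single restatement of a reported loss on a ground-up, trended basis. The helper name and figures are hypothetical, and note that claims fully eliminated by the deductible cannot be recovered this way:

```python
def ground_up_trended(reported_loss, deductible, trend_factor):
    """Add the deductible back to a censored loss, then apply the
    severity trend factor. Claims eliminated entirely by the
    deductible never appear in the data and cannot be restated."""
    return (reported_loss + deductible) * trend_factor

# A 9,000 reported loss under a 1,000 deductible, trended at 5%:
# (9,000 + 1,000) * 1.05 = 10,500
print(round(ground_up_trended(9_000, 1_000, 1.05), 2))
```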
Fitted Data Approach
Fit curves to empirical data
- Smooth out random fluctuations
- Common distributions include lognormal, Pareto, and the truncated Pareto
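With a fitted lognormal severity, ILFs follow from limited expected values: ILF(l) = E[X ∧ l] / E[X ∧ b] for base limit b. A minimal sketch using the closed-form lognormal LEV; the parameters mu, sigma and the limits are illustrative, not fitted to any real data:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lognormal_lev(limit, mu, sigma):
    """Limited expected value E[min(X, limit)] for lognormal severity:
    exp(mu + sigma^2/2) * Phi(a - sigma) + limit * (1 - Phi(a)),
    where a = (ln(limit) - mu) / sigma."""
    a = (math.log(limit) - mu) / sigma
    return (math.exp(mu + sigma ** 2 / 2.0) * norm_cdf(a - sigma)
            + limit * (1.0 - norm_cdf(a)))

def ilf(limit, base, mu, sigma):
    """Increased limit factor relative to the base limit."""
    return lognormal_lev(limit, mu, sigma) / lognormal_lev(base, mu, sigma)

mu, sigma = 9.0, 1.5  # hypothetical fitted parameters
print(round(ilf(500_000, 100_000, mu, sigma), 3))
```

Taking a ratio of LEVs isolates severity; under the univariate LAS approach this implicitly assumes frequency does not vary by limit.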
Multivariate Approach to ILFs
1. GLMs can deal more effectively with sparse data
2. Major difference between the GLM and univariate approaches using limited average severities (LASs):
   - the GLM does not assume frequency is the same for all risks
Two Basic Types of Deductibles
1. Flat dollar deductible: specifies a dollar amount below which losses are not covered by the policy
2. Percentage deductible: stated as a percentage of the coverage amount
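The two deductible types can be sketched as payment functions; the loss and coverage amounts below are illustrative:

```python
def flat_deductible_payment(loss, deductible):
    """Flat dollar deductible: the insurer pays the loss in excess
    of the stated dollar amount (never less than zero)."""
    return max(loss - deductible, 0)

def percent_deductible_payment(loss, pct, coverage_amount):
    """Percentage deductible: the dollar deductible is a stated
    percentage of the coverage amount."""
    return max(loss - pct * coverage_amount, 0)

print(flat_deductible_payment(1_500, 500))  # → 1000
# 2% of a 200,000 coverage amount is a 4,000 dollar deductible:
print(round(percent_deductible_payment(10_000, 0.02, 200_000), 2))
```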
Some reasons deductibles are used
1. Premium reduction
2. Eliminates small nuisance claims
3. Provides an incentive for loss control
4. Controls catastrophic exposure
Other considerations to deductible ratemaking
1. Censored data
   - Ground-up losses may not be known because losses below the deductible are often not reported
   - Cannot use data from policies with higher deductibles to price lower deductibles
2. Trend and development
   - Losses should be trended and developed
Fitted Data Approach to deductible pricing
The LER can be calculated given a continuous distribution of losses.
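A minimal sketch, assuming exponential severity so the LER has a closed form (E[X ∧ d] = θ(1 − e^(−d/θ)) and E[X] = θ); the mean loss and deductible are illustrative:

```python
import math

def ler_exponential(deductible, theta):
    """Loss elimination ratio LER(d) = E[min(X, d)] / E[X] for
    exponential severity with mean theta: 1 - exp(-d / theta)."""
    return 1.0 - math.exp(-deductible / theta)

theta = 5_000  # hypothetical mean ground-up loss
d = 500        # hypothetical deductible
print(round(ler_exponential(d, theta), 4))  # → 0.0952
# The deductible relativity to first-dollar coverage is 1 - LER(d).
```

A heavier-tailed fitted distribution (e.g., lognormal or Pareto) would use its own limited expected value formula in the same ratio.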