Past papers 2016 Flashcards
Explain the effect that inadequate data could have on the insurer in terms of pricing and business acquisition.
When pricing, it is important that we monitor the progress of existing experience as it develops, in order to assess the need for a review. Thus one effect of inadequate data is that we might make a WRONG DECISION on the need for a review and on the rates to be charged.
When we carry out the actual projections of the new rating requirements, inadequate data may distort the calculations. This may be due to errors in:
- the apparent SIZE OF THE BUSINESS IN FORCE, and its value expressed in exposure units and premium
- the apparent CLAIMS EXPERIENCE and its trends, on which the projected future costs are being based.
Moreover, the errors may distort the true distribution of the business between risk groups. This could have consequences if we decided to adopt a differential rating increase for each risk group. It could also affect the marketing strategy if certain risk groups appeared to be more attractive risks than they actually are.
If we adopt a deficient set of rates as a result of faulty data, the insurer might:
- suffer UNDERWRITING LOSSES if rates are too low
- suffer LOSS OF MARKET SHARE if rates are too high
- attract UNDESIRABLE RISKS, causing deterioration in underwriting experience if rates for such risks are too low.
Outline the measures a company could take to mitigate the effects of using inadequate data or poor quality data
- Take a PRUDENT VIEW of future experience and reflect this in the pricing structure.
- Consider writing only a SUBSET of risks/perils until actual experience becomes available.
- Consider only accepting liability covers with low limits and exposures until actual experience becomes available.
- Use more REINSURANCE, reducing the retention to reduce risk.
- Examine the SENSITIVITY OF PRICING MODELS to assumptions, particularly where sensitivity to an assumption drives the decision on whether or not to write business.
- Carry out a “WHAT IF” ANALYSIS of a draft rating structure and set of decline rules to see what business would be won, and at what price, by comparing the output from pricing models with the prices charged by other insurers in the market.
- Put in place MONITORING OF KEY STATISTICS, such as volumes, premiums, mix of business and cause of claims to spot possible problems early.
- Ensure the company can change rates quickly, e.g. with renewals every 6 months instead of annual renewals.
- Consider not selling insurance policies until better-quality data is available for pricing.
Economic capital
Economic capital is the amount of capital that a company determines is appropriate to hold given its assets, liabilities and business objectives.
It takes into account the riskiness of the individual assets, liabilities, correlations between these risks and the overall level of credit deterioration the company wishes to be able to withstand.
XL Treaties
Protect the insurer against large individual accident losses as well as catastrophes that may lead to an accumulation of losses from a single event.
What will the actuary need to do in order to understand the impact of an increase in the XL retention limit on the capital requirements?
The actuary will need to investigate the additional volatility caused by the proposed increase in the retention.
This can be done by:
- Analysing the VOLATILITY IN FREQUENCY AND SEVERITY of large individual historic losses suffered by the insurer. This can be done using several years’ past data; historic losses should be adjusted for inflation and developed to ultimate (see the sketch after this list).
- Analysing the FREQUENCY AND SEVERITY distributions of accumulation-type events, including natural catastrophes such as hail and storm.
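A minimal sketch (all loss amounts and retentions illustrative) of how inflation-adjusted historic large losses could be re-capped under the current and proposed retentions, to compare the mean and volatility of the retained annual aggregate:

```python
import numpy as np

# Hypothetical large losses by accident year, already inflation-adjusted
# and developed to ultimate.
historic_losses = {
    2013: [1.2e6, 3.5e6, 0.8e6],
    2014: [2.1e6, 5.0e6],
    2015: [0.9e6, 1.4e6, 7.2e6, 2.3e6],
}

def retained_aggregate(losses, retention):
    """Insurer's annual retained aggregate: each loss capped at the retention."""
    return sum(min(loss, retention) for loss in losses)

# Compare mean and volatility of retained losses under each retention.
for retention in (1.0e6, 2.0e6):  # illustrative current vs proposed retention
    annual = [retained_aggregate(losses, retention)
              for losses in historic_losses.values()]
    print(f"retention {retention:,.0f}: mean {np.mean(annual):,.0f}, "
          f"std dev {np.std(annual, ddof=1):,.0f}")
```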
3 Types of investigation that an actuarial team could undertake to assess the company in formulating an optimal reinsurance structure
- Expected impact on profit, e.g. reinsurance premium payable vs expected recoveries.
- Analysis of alternative reinsurance structures.
- Analysis of reinsurance commission structures, overriders and profit commission.
Perils covered by construction and engineering policies
- Damage to the project
- Destruction
- Design defects
- Discovery of construction faults
- Faulty parts
- Failure to finish the project, or finish on time
Moral hazard
Moral hazard is the risk that an insured may behave in a less risk-averse manner when they are insured.
Define a “working layer”
Excess of loss reinsurance indemnifies the cedant for the amount of a loss above a stated excess point, usually up to an upper limit.
A WORKING LAYER is a layer of excess of loss reinsurance where the excess point is at a low enough level for it to be likely to experience a fairly regular flow of claims.
Type(s) of reinsurance commission which may be paid to a direct writer in respect of Excess of Loss reinsurance
PROFIT COMMISSION is the only type of reinsurance commission likely to be payable (if any).
This is commission which is dependent upon the profitability or claims experience of the business ceded during each accounting period.
Profit commission may be payable for a working layer because:
- the EXPERIENCE in a working layer is LESS RANDOM than for higher layers, and
- it is more LIKELY TO BE REPRESENTATIVE of the underlying risk.
Define stability clause
A clause that may be included in a non-proportional reinsurance treaty,
providing for the INDEXATION OF MONETARY LIMITS (that is, the excess point and/or the upper limit) in line with a specified index of inflation.
It is designed to MAINTAIN THE REAL MONETARY VALUE of the excess point and the upper limit under non-proportional reinsurance.
The impact of a stability clause
Depends on the cedant’s actual claims experience, on claims inflation relative to the specified index, and on the levels of the excess point and upper limit.
If there were no upper limit and actual claims inflation were lower than the index inflation, then the frequency of losses to the layer would drop over time.
If claims inflation is the same as the inflation of the specified index, then the expected real cost of claims to the layer will remain constant.
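A short numerical illustration (all figures made up) of how the indexed excess point interacts with actual claims inflation; here the index runs at 5% p.a. while claims inflate at 3% p.a., so recoveries from a given claim shrink over time:

```python
# Illustrative only: 1m excess point indexed at 5% p.a. under the stability
# clause, while actual claims inflation runs at 3% p.a.
excess_point, index_inflation, claims_inflation = 1.0e6, 0.05, 0.03
claim_at_inception = 1.1e6  # a claim that breaches the layer in year 0

for year in range(5):
    indexed_excess = excess_point * (1 + index_inflation) ** year
    inflated_claim = claim_at_inception * (1 + claims_inflation) ** year
    recovery = max(inflated_claim - indexed_excess, 0.0)
    print(f"year {year}: indexed excess {indexed_excess:,.0f}, "
          f"claim {inflated_claim:,.0f}, recovery {recovery:,.0f}")
```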
Define deductible
A deductible is the amount which, in accordance with the terms of the policy, is deducted from the claim amount that would otherwise have been payable, and will therefore be borne by the cedant.
Define aggregate deductible
The maximum amount that the cedant can retain within its deductible when all losses are aggregated.
Introduction of an aggregate deductible means that the sum of the claims to the layer must exceed the deductible before the cedant can make a recovery.
So, for a given amount of exposure, we expect the aggregate deductible to reduce the cedant’s expected recovery and increase the cedant’s retention.
Define ranking deductible
A ranking deductible applies to each individual loss and contributes towards the insured’s aggregate deductible.
Non-ranking deductibles and trailing deductibles do not contribute to the aggregate deductible.
Per-occurrence limit
The maximum amount that the insurer can retain for each individual loss.
Introduction of a per-occurrence limit means there is a maximum potential retained loss on each claim; hence adding a per-occurrence limit reduces the expected retained cost of claims.
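A minimal sketch (hypothetical figures, simplified programme) pulling the last three definitions together: each loss generates a retention capped by the per-occurrence limit, these ranking retentions erode the aggregate deductible, and once it is exhausted further losses are recovered in full:

```python
def cedant_retention(losses, per_loss_deductible, per_occurrence_limit,
                     aggregate_deductible):
    """Total amount retained by the cedant under a simple deductible programme."""
    total_retained = 0.0
    for loss in losses:
        # Per loss: retain the deductible, capped by the per-occurrence limit.
        retention = min(loss, per_loss_deductible, per_occurrence_limit)
        # Ranking retentions only count up to the remaining aggregate deductible.
        retention = min(retention, aggregate_deductible - total_retained)
        total_retained += retention
    return total_retained

# Hypothetical figures: 100k deductible, 80k per-occurrence cap, 200k aggregate.
losses = [50e3, 120e3, 90e3, 300e3]
print(cedant_retention(losses, 100e3, 80e3, 200e3))  # -> 200000.0
```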
4 Steps to estimate the next 3 years’ reinsurance risk premium under different reinsurance structures using a frequency-severity simulation model.
- Determine the claim frequency and severity distributions
- Determine exposures for the next 3 years
- Simulation for a specific reinsurance structure.
- Repeat step 3 for different reinsurance structures
Estimate the next 3 years’ reinsurance risk premium under different reinsurance structures using a frequency-severity simulation model.
Step 1: Determine the claim frequency and severity distributions
Use the company’s claims database; check the data for completeness and correct any obvious data anomalies.
Pick a base period to use, e.g. last 5 years.
If there are any policy limits on the claims, the number of claims (and an amount for each claim) below the insured’s retention and above the policy limit needs to be estimated.
Use standard reserving techniques (chain ladder / BF) to calculate the number of IBNR claims and their cost. For a frequency-severity approach, we need to know the individual claim sizes and period in which the claim occurred. It is important to apply a development pattern that is appropriate for the losses being developed.
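As an illustration of the chain-ladder step, a toy cumulative claim-count triangle (made-up numbers; a BF approach, or paid/incurred triangles, would follow the same pattern):

```python
import numpy as np

# Hypothetical cumulative claim-count triangle (rows: accident years,
# columns: development periods; np.nan marks future diagonals).
triangle = np.array([
    [100.0, 150.0, 170.0, 175.0],
    [110.0, 160.0, 180.0, np.nan],
    [120.0, 175.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

# Volume-weighted chain-ladder development factors.
factors = []
for j in range(triangle.shape[1] - 1):
    known = ~np.isnan(triangle[:, j + 1])
    factors.append(triangle[known, j + 1].sum() / triangle[known, j].sum())

# Project each accident year to ultimate; IBNR is the excess over the latest known.
for i, row in enumerate(triangle):
    last = np.where(~np.isnan(row))[0][-1]
    ultimate = row[last] * np.prod(factors[last:])
    print(f"accident year {i}: ultimate {ultimate:.0f}, IBNR {ultimate - row[last]:.0f}")
```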
All claims from past years (including IBNR) need to be developed to ultimate and treated “as-if” they occurred in the following period. Consider appropriate assumptions to adjust for claims inflation, changes in policy wording / risks covered.
A decision regarding large and catastrophe claims needs to be made. Large claims and catastrophe claims are normally modelled separately.
Fit frequency and severity distributions to the losses, e.g.:
- FREQUENCY: Poisson, Negative Binomial
- SEVERITY: LogNormal, Weibull, Pareto, Gamma.
Apply STATISTICAL TESTS to determine the goodness of fit, e.g. Chi-squared Test or Kolmogorov-Smirnov statistic.
If there is sufficient volume of losses, consider fitting a number of different distributions to different parts of the overall loss range.
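A minimal sketch of the fitting stage using scipy, assuming the claim counts and amounts have already been cleaned, inflated and developed to ultimate as described above (the amounts here are synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Assumed inputs: annual ultimate claim counts for the base period and
# individual ultimate claim amounts (synthetic here for illustration).
claim_counts = np.array([42, 55, 48, 61, 50])
claim_amounts = rng.lognormal(mean=10.0, sigma=1.2, size=claim_counts.sum())

# FREQUENCY: estimate a Poisson rate from the mean annual claim count.
poisson_lambda = claim_counts.mean()

# SEVERITY: fit a lognormal by maximum likelihood, fixing the location at zero.
shape, loc, scale = stats.lognorm.fit(claim_amounts, floc=0)

# Goodness of fit for the severity distribution via the Kolmogorov-Smirnov test.
# (Strictly, fitting first biases the KS p-value; a refinement would adjust for this.)
ks_stat, p_value = stats.kstest(claim_amounts, "lognorm", args=(shape, loc, scale))
print(f"lambda = {poisson_lambda:.1f}, KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```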
Estimate the next 3 years’ reinsurance risk premium under different reinsurance structures using a frequency-severity simulation model.
Step 2: Determine exposures for the next 3 years
Using the company’s exposure database, assumptions regarding new business volumes over the next 3 years can be determined.
Estimate the next 3 years’ reinsurance risk premium under different reinsurance structures using a frequency-severity simulation model.
Step 3: Simulation for a specific reinsurance structure
- Simulate claims experience in each year based on that year’s exposures.
- Re-run the simulation a number of times. Each simulation will produce its own estimate of the number of claims and a corresponding set of claim amounts.
- For each simulation, apply excesses, limits and deductibles to determine total reinsurance recoveries.
- The average reinsurance recovery over all simulations in a particular year, plus a loading for catastrophe and large claims, gives an estimate of the reinsurance risk premium (a sketch follows this list).
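A simplified sketch of Step 3 for a per-risk excess-of-loss structure (all parameters illustrative; large and catastrophe loadings are ignored). Re-running it with different excess points and limits corresponds to repeating Step 3 for different structures:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_risk_premium(n_sims, expected_freq, sev_mu, sev_sigma,
                          excess_point, layer_limit):
    """Mean simulated annual recovery for a per-risk XL layer (the risk premium)."""
    recoveries = np.empty(n_sims)
    for i in range(n_sims):
        n_claims = rng.poisson(expected_freq)                 # simulated claim count
        severities = rng.lognormal(sev_mu, sev_sigma, n_claims)
        # Recovery per claim: the amount above the excess point, capped at the limit.
        recoveries[i] = np.minimum(
            np.maximum(severities - excess_point, 0.0), layer_limit).sum()
    return recoveries.mean()

# Illustrative exposure-adjusted claim frequencies for the next 3 years.
for year, freq in enumerate([50, 53, 56], start=1):
    rp = simulate_risk_premium(10_000, freq, sev_mu=10.0, sev_sigma=1.2,
                               excess_point=0.5e6, layer_limit=1.0e6)
    print(f"year {year}: estimated reinsurance risk premium {rp:,.0f}")
```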
4 potential challenges faced in comparing claims development patterns between two different agents writing the same class of business.
The agents may use different definitions of when a claim is first recorded, e.g.
when first notified or when all supporting evidence has been provided.
One agent may remove claims from its monthly listing if a claim settles as null, whereas the other may keep the claim on the listing with a zero amount.
The agents may have different approaches to setting CASE RESERVES, e.g. one might put a conservative/full reserve on notification, whereas the other may put a token or zero reserve until investigations have been completed.
They will have different processes for SETTLING A CLAIM, so the speed of paid settlement will vary between agents.
Whilst both agents write the same class of business, they may operate in DIFFERENT GEOGRAPHICAL MARKETS and sell to different TARGET MARKETS.
Each agent may use DIFFERENT SALES CHANNELS resulting in a different mix of underlying risk with different development patterns.
ADVANTAGES of using a stochastic ALM relative to a deterministic approach for the purpose of setting investment strategy.
A stochastic model is better for considering a BIGGER SET OF POSSIBLE SCENARIOS (usually several thousand) while a deterministic ALM can only practically consider a few scenarios.
The scenarios in the stochastic model are chosen randomly and therefore not subject to the modeller’s potential biases and limited perspective.
A stochastic model’s outputs incorporate probabilities (thus the likelihood of unfavourable outcomes associated with particular investment strategies), while the output from a deterministic model does not.
Both stochastic and deterministic ALMs can allow for suitable interaction between assets and liabilities, so this in itself is not a difference.
Risk-based solvency regimes like SAM and Solvency II require a better understanding of risks (including investment risks) faced by the company, which are better modelled by a stochastic ALM.
DISADVANTAGES of using a stochastic ALM relative to a deterministic approach for the purpose of setting investment strategy.
Stochastic models may have higher MODEL RISK (due to greater complexity) and introduce spurious accuracy in the modelling.
Practical difficulties are greater for stochastic models:
- They are more DIFFICULT (requiring expertise) TO BUILD, CALIBRATE AND RUN.
- They REQUIRE MORE DATA than deterministic models.
- They are more COSTLY TO OBTAIN (build or purchase) and to maintain.
- The output may be more DIFFICULT TO INTERPRET.
- They are more time-consuming to run.
Explain the purpose of an ESG within an ALM exercise
An ESG typically takes the form of a specialised asset model that stochastically models various asset classes.
The output from an ESG includes the performance of each economic variable (e.g. inflation, asset class returns, GDP etc.) at each future projection point for several simulations.
This table of simulation outputs is then used as an input to the ALM.
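A sketch of the shape of ESG output feeding an ALM, assuming three economic variables, ten annual projection points and 1,000 simulations; in practice the paths come from a calibrated ESG rather than the independent normal draws used here as placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims, n_years = 1000, 10

# Placeholder ESG output: in practice these paths come from a calibrated
# economic scenario generator, not the independent normal draws used here.
esg_output = {
    "inflation":     rng.normal(0.05, 0.01, (n_sims, n_years)),
    "equity_return": rng.normal(0.08, 0.15, (n_sims, n_years)),
    "bond_return":   rng.normal(0.06, 0.05, (n_sims, n_years)),
}

# The ALM then projects assets and liabilities along each simulated path.
print({name: paths.shape for name, paths in esg_output.items()})
```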