CS2 - Part 3 Flashcards

1
Q

General formula for Cox proportional hazard (PH) model

A

lambda(t; z_i) = lambda_0(t) * exp(beta^T z_i)

where lambda_0(t) is the baseline hazard, z_i the covariate vector of life i and beta the vector of regression parameters.
2
Q

Ratio of hazards of lives with covariate vectors z1 and z2 (Cox PH model)

A

lambda(t; z1) / lambda(t; z2) = exp(beta^T (z1 - z2))

The ratio does not depend on t: the hazards are proportional.
3
Q

Proportional hazards model: Likelihood estimator for beta vector

A

Maximise the partial likelihood

L(beta) = product over deaths j of exp(beta^T z_j) / sum over i in R(t_j) of exp(beta^T z_i)

where R(t_j) is the risk set just before the j-th death time.
4
Q

Aims of graduation

A
  • Produce smooth set of rates that are suitable for a particular purpose
  • Remove random sampling errors
  • Use the information available from adjacent ages
5
Q

Desirable features of graduation

A
  • Smoothness
  • Adherence to data
  • Suitability to purpose to hand
6
Q

Degrees of freedom for the chi-squared test

A
  • Start with the number of groups
  • If the groups form a set of mutually exclusive and exhaustive categories (probabilities add up to 1), subtract 1
  • Subtract a further 1 for each parameter that has been estimated
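
The rule above can be applied as simple arithmetic; a worked example with hypothetical numbers:

```python
# Hypothetical example: 10 mutually exclusive, exhaustive age groups
# (probabilities sum to 1) and 2 parameters estimated from the data.
groups = 10
constraint = 1          # subtract 1 because probabilities add up to 1
params_estimated = 2    # subtract 1 per estimated parameter

degrees_of_freedom = groups - constraint - params_estimated
print(degrees_of_freedom)  # 7
```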
7
Q

Distributions of D_x and mu~x

A

Under the Poisson model, D_x ~ Poisson(E_x^c * mu).

The estimator mu~_x = D_x / E_x^c is asymptotically normal with mean mu and variance mu / E_x^c.
8
Q

Mortality experience: Deviation

A

Deviation at age x: observed deaths minus expected deaths, D_x - E_x^c * mu_x.
9
Q

Mortality experience: Standardised deviation

A

z_x = (D_x - E_x^c * mu_x) / sqrt(E_x^c * mu_x)

Approximately standard normal under the null hypothesis.
10
Q

Degrees of freedom when comparing an experience with a standard table

A

Degrees of freedom = number of age groups

11
Q

Chi-squared failures: Standardised deviations test

A

Detects a few large deviations that the chi-squared test did not detect.

Check whether the standardised deviations follow a standard normal distribution (e.g. compare observed and expected counts of deviations in ranges, using a chi-squared test).

12
Q

Chi-squared failures: Signs test

A

To detect imbalance between negative and positive deviations

Under H0 the number of positive deviations (out of m age groups) follows a Binomial(m, 1/2) distribution.

For N, the observed number of negative deviations: check that 2*P(N <= observed) > 5%.

For P, the observed number of positive deviations: check that 2*P(P >= observed) > 5%.
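
The two-sided check above can be sketched with the stdlib binomial tail sums (function name and example data are illustrative):

```python
from math import comb

def signs_test(deviations, alpha=0.05):
    """Two-sided signs test sketch: under H0 each deviation is
    positive or negative with probability 1/2, so the number of
    positive deviations is Binomial(m, 1/2).
    Returns True when H0 is not rejected at level alpha."""
    m = len(deviations)
    pos = sum(1 for d in deviations if d > 0)
    # Binomial(m, 1/2) tail probabilities P(P >= pos) and P(P <= pos)
    p_upper = sum(comb(m, k) for k in range(pos, m + 1)) / 2**m
    p_lower = sum(comb(m, k) for k in range(0, pos + 1)) / 2**m
    p_value = min(2 * min(p_upper, p_lower), 1.0)
    return p_value > alpha

# 3 positive deviations out of 10: two-sided p ~ 0.344 > 5%, so pass
print(signs_test([1, -1, -1, 1, -1, -1, 1, -1, -1, -1]))  # True
```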

13
Q

Chi-squared failures: Cumulative deviations

A

Detects overall bias (consistent over- or under-statement of mortality), which the chi-squared test can miss because it squares the deviations.

Test statistic: (sum of D_x - sum of E_x^c * mu_x) / sqrt(sum of E_x^c * mu_x), approximately N(0,1) under the null hypothesis.
14
Q

Chi-squared failures: Grouping of signs test

A

Detects ‘clumping’ of deviations with the same sign.

Check the ‘Grouping of signs test’ in the Tables.

If the number of groups (runs) of positive deviations is less than or equal to the tabulated critical value, reject the null hypothesis.
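
The test statistic (the number of positive runs) can be counted directly; a minimal sketch with an illustrative function name:

```python
def positive_runs(deviations):
    """Count groups (runs) of consecutive positive deviations --
    the statistic compared against the grouping-of-signs table."""
    runs = 0
    prev_positive = False
    for d in deviations:
        positive = d > 0
        if positive and not prev_positive:
            runs += 1  # a new positive run starts here
        prev_positive = positive
    return runs

# The sign pattern +,+,-,+,-,-,+ contains three positive runs
print(positive_runs([2, 1, -3, 4, -1, -2, 5]))  # 3
```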

15
Q

Testing smoothness of graduation

A

Third difference (change in curvature) of the graduated quantities should

  • Be small in magnitude compared with the quantities themselves
  • Progress regularly
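
Third differences are just three successive passes of first differencing; a minimal sketch (function name is illustrative):

```python
def third_differences(rates):
    """Apply first differencing three times to a list of graduated
    rates; a smooth graduation gives small, regularly progressing
    third differences."""
    diff = lambda xs: [b - a for a, b in zip(xs, xs[1:])]
    return diff(diff(diff(rates)))

# A quadratic sequence has identically zero third differences
print(third_differences([x**2 for x in range(6)]))  # [0, 0, 0]
```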
16
Q

Methods of graduation

A
  • Graduation by parametric formula
    • a1 + a2 exp(a3x + a4x^2+…)
    • well-suited to the production of standard tables from large amounts of data
  • Graduation by reference to standard table
    • (a+bx) mu_x^s
    • Can be used to fit relatively small data sets where a suitable standard table exists
  • Graduation using spline functions
    • Method is suitable for quite small experiences as well as very large experiences.
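
A rough illustration of graduation by reference to a standard table: fitting mu_x = (a + b x) mu_x^s by regressing the ratio of crude to standard rates on age. This is a simple least-squares sketch with an illustrative function name, not the weighted/maximum-likelihood fitting normally used in practice:

```python
def fit_standard_table_link(ages, crude, standard):
    """Least-squares sketch for mu_x = (a + b*x) * mu_x^s:
    simple linear regression of the ratio crude/standard on age x."""
    y = [c / s for c, s in zip(crude, standard)]
    n = len(ages)
    xbar, ybar = sum(ages) / n, sum(y) / n
    sxx = sum((x - xbar) ** 2 for x in ages)
    sxy = sum((x - xbar) * (v - ybar) for x, v in zip(ages, y))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

# Exactly linear ratios are recovered (up to rounding):
ages = [60, 61, 62, 63]
standard = [0.010, 0.011, 0.012, 0.013]
crude = [(0.5 + 0.01 * x) * s for x, s in zip(ages, standard)]
a, b = fit_standard_table_link(ages, crude, standard)
print(round(a, 6), round(b, 6))  # 0.5 0.01
```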
17
Q

Mortality projection - Method based on expectation

A

Set target mortality rates (or target rates of improvement) based on expert opinion, together with an assumed path of convergence from current rates to the targets. Simple and cheap to apply, but subjective.
18
Q

Autocovariance function

A

For a stationary process: gamma_k = cov(X_t, X_{t+k}), which depends only on the lag k.
19
Q

Simplify:

A
20
Q

Autocorrelation function

A

rho_k = gamma_k / gamma_0 = corr(X_t, X_{t+k})
21
Q

Correlation formula

A

corr(X, Y) = cov(X, Y) / sqrt(var(X) * var(Y))
22
Q

Autoregressive process of order p

AR(p)

A

X_t = mu + a1 (X_{t-1} - mu) + … + ap (X_{t-p} - mu) + e_t

where e_t is a white noise process.
23
Q

Moving average process of order q

MA(q)

A

X_t = mu + e_t + b1 e_{t-1} + … + bq e_{t-q}

where e_t is a white noise process.
24
Q

Autoregressive moving average

ARMA(p,q)

A

X_t = mu + a1 (X_{t-1} - mu) + … + ap (X_{t-p} - mu) + e_t + b1 e_{t-1} + … + bq e_{t-q}
25
Q

Condition for stationarity of AR(p) process

A

All roots of the characteristic equation 1 - a1 z - a2 z^2 - … - ap z^p = 0 lie strictly outside the unit circle (|z| > 1).
26
Q

Conditions for invertibility of MA processes

A

Invertibility: the white noise process e can be written explicitly in terms of the X process.

Condition: all roots of the MA characteristic equation 1 + b1 z + … + bq z^q = 0 lie strictly outside the unit circle.

27
Q

Moving average model MA(q), in backward shift notation

A

X_t = mu + beta(B) e_t, where beta(B) = 1 + b1 B + … + bq B^q and B is the backward shift operator (B X_t = X_{t-1}).
28
Q

ARMA(p,q) process defined in backward shift notation

A

alpha(B)(X_t - mu) = beta(B) e_t

where alpha(B) = 1 - a1 B - … - ap B^p and beta(B) = 1 + b1 B + … + bq B^q.
29
Q

Definition of an ARIMA process

A

X is an ARIMA(p,d,q) process if the d-th difference (1-B)^d X_t is a stationary ARMA(p,q) process.
30
Q

Features of MA(q) process

A
  • ACF cuts off: zero beyond lag q
  • PACF decays towards zero
  • Always stationary; invertible subject to conditions on the b coefficients
31
Q

Features of AR(p) process

A
  • PACF cuts off: zero beyond lag p
  • ACF decays (geometrically) towards zero
  • Stationary subject to conditions on the a coefficients
32
Q

Features of ARMA (p,q) process

A
  • Both ACF and PACF decay towards zero, with no clean cut-off
  • Stationarity is governed by the AR part, invertibility by the MA part
33
Q

Three possible causes of non-stationarity

A
  1. Deterministic trend (e.g. exponential or linear growth)
  2. Deterministic cycle (e.g. seasonal effect)
  3. Time series is integrated
34
Q

Methods for compensating for trend/seasonality (6)

A
  • Least squares trend removal (Tables p.24)
  • Differencing
    • Differencing d times makes an I(d) series stationary; one difference also removes a linear trend
  • Seasonal differencing
    • E.g. differencing at lag 12 to remove annual seasonality from monthly data
  • Method of moving averages
    • Create transformation such that transformed time series is moving average of original time series
  • Method of seasonal means
  • Transformation of the data
    • E.g. take log
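
Ordinary and seasonal differencing above are the same operation at different lags; a minimal sketch (function name is illustrative):

```python
def difference(series, lag=1):
    """One pass of the difference operator x_t - x_{t-lag}.
    lag=1 (applied d times) handles an I(d) series and removes
    polynomial trends; lag=s removes seasonality of period s
    (e.g. s=12 for monthly data with annual seasonality)."""
    return [b - a for a, b in zip(series, series[lag:])]

trend = [3 * t + 1 for t in range(6)]
print(difference(trend))  # [3, 3, 3, 3, 3] -- linear trend becomes constant
```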
35
Q

Check if observed time series is stationary

A

The sample autocorrelation function should decay to 0 quickly (roughly exponentially); slow, roughly linear decay suggests non-stationarity.

36
Q

Identification of white noise

A

Option 1:

  • Check whether values of the SACF or SPACF fall outside the range
  • +-2/sqrt(n) (an approximation to +-1.96/sqrt(n))
  • Note that, under the null hypothesis, each value has roughly a 1 in 20 chance of falling outside the range (95% bounds)

Option 2:

  • Portmanteau test (tables p. 42)
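
Option 1 can be sketched directly from the definitions (function names are illustrative; the +-2/sqrt(n) band is the approximation from the card):

```python
from math import sqrt

def sample_acf(x, max_lag):
    """Sample autocorrelations r_k = c_k / c_0, where
    c_k = (1/n) * sum_t (x_t - xbar)(x_{t+k} - xbar)."""
    n = len(x)
    xbar = sum(x) / n
    c0 = sum((v - xbar) ** 2 for v in x) / n
    return [
        sum((x[t] - xbar) * (x[t + k] - xbar) for t in range(n - k)) / n / c0
        for k in range(1, max_lag + 1)
    ]

def white_noise_bound(n):
    """Approximate 95% cut-off +-2/sqrt(n) for the SACF of white noise."""
    return 2 / sqrt(n)

# An alternating series is strongly negatively autocorrelated at lag 1,
# so its r_1 falls far outside the white-noise band.
x = [1, -1] * 4
print(sample_acf(x, 1), white_noise_bound(len(x)))
```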
37
Q

Identification of MA(q)

A

The SACF cuts off: negligible beyond lag q, while the SPACF decays towards zero.
38
Q

Identification of AR(p)

A

The SPACF cuts off: negligible beyond lag p, while the SACF decays towards zero.
39
Q

Identification of appropriate order of differencing (d) from sample data

A
  • A slowly decaying sample autocorrelation function indicates that the time series needs to be differenced
  • Difference repeatedly and look for the smallest sample variance over d = 1, 2, 3, …
40
Q

Diagnostic checking for fitted ARIMA model

A

The residuals of the fitted model should behave like white noise:

  • inspect the SACF/SPACF of the residuals
  • apply a portmanteau (Ljung-Box) test
  • inspect residual plots for remaining patterns or changing variance
41
Q

Condition for stationarity of vector autoregressive process

A

All eigenvalues of the matrix A have modulus strictly less than 1.
42
Q

Calculate eigenvalues of matrix A

A

Values lambda, such that

det (A-lambda*I) = 0
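
For a 2x2 matrix the characteristic equation is an explicit quadratic; a minimal sketch (function name is illustrative, and real eigenvalues are assumed):

```python
from math import sqrt

def eigenvalues_2x2(a, b, c, d):
    """Solve det(A - lambda*I) = 0 for A = [[a, b], [c, d]]:
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

print(eigenvalues_2x2(2, 0, 0, 3))  # (3.0, 2.0)
```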

43
Q

Two time series processes X and Y are called cointegrated if:

A
  • X and Y are I(1) random processes
  • there exists a non-zero vector (a,b) such that aX+bY is stationary

The vector (a,b) is called cointegration vector.

44
Q

Moment generating function (formula)

A

M_X(t) = E[exp(tX)]
45
Q

Cumulant generating function

A

C_X(t) = log M_X(t)

Its derivatives at t = 0 give the cumulants (mean, variance, …).
46
Q

Coefficient of skewness

A

Third standardised moment: E[(X - mu)^3] / sigma^3
47
Q

Kurtosis

A
  • Fourth standardised moment
  • kurtosis = 3: mesokurtic (normal distribution)
  • kurtosis >3 leptokurtic
    • more peaked, fatter tails
  • kurtosis <3 platykurtic
    • broader peak, more slender tails
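
The fourth standardised moment can be computed directly from sample moments; a minimal sketch (function name is illustrative, using the population/biased moment estimators):

```python
def kurtosis(xs):
    """Fourth standardised moment m4 / m2^2, using the population
    (biased) moment estimators."""
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m4 = sum((x - mu) ** 4 for x in xs) / n
    return m4 / m2 ** 2

# A symmetric two-point sample is extremely platykurtic (kurtosis 1 < 3)
print(kurtosis([-1.0, 1.0, -1.0, 1.0]))  # 1.0
```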
48
Q

Standardised moment

A

kth standardised moment: E[((X - mu) / sigma)^k]
49
Q

Varying volatility over time

A

Heteroscedasticity

50
Q

Central limit theorem

A

For i.i.d. X_1, …, X_n with mean mu and variance sigma^2:

(X_1 + … + X_n - n*mu) / (sigma * sqrt(n)) converges in distribution to N(0,1) as n -> infinity.
51
Q

Generalized extreme value distribution

A

F(x) = exp( -(1 + gamma*(x - mu)/sigma)^(-1/gamma) ) for 1 + gamma*(x - mu)/sigma > 0

with the limiting form exp(-exp(-(x - mu)/sigma)) when gamma = 0.
52
Q

GEV distributions: Different values of shape parameter gamma

A
  • gamma > 0: Fréchet family (heavy tails)
  • gamma = 0: Gumbel family (light, exponential-type tails)
  • gamma < 0: Weibull family (bounded upper tail)
53
Q

Rough criteria to choose family of GEV distributions

A
  • Finite upper limit: Weibull
  • Unbounded, with exponential-type tail and all moments finite: Gumbel
  • Unbounded, heavy-tailed with some moments infinite: Fréchet
54
Q

Distribution of excess above u

A

P(X - u <= x | X > u) = (F(u + x) - F(u)) / (1 - F(u))

For large u this is approximately a generalised Pareto distribution (GPD).
55
Q

kth moment of a continuous positive-valued distribution with density function f(x)

A

E[X^k] = integral from 0 to infinity of x^k f(x) dx
56
Q

Measures of tail weight

A
  • Existence of moments (the fewer finite moments, the heavier the tail)
  • Limiting ratio of the densities of two distributions
  • Hazard rate (a decreasing hazard rate indicates a heavy tail)
  • Mean residual life (an increasing mean residual life indicates a heavy tail)
57
Q

Coefficient of upper tail dependence

A

lambda_U = lim as u -> 1- of P(X > F_X^{-1}(u) | Y > F_Y^{-1}(u))
58
Q

Coefficient of lower tail dependence in terms of the copula function

A

lambda_L = lim as u -> 0+ of C(u, u) / u
59
Q

Coefficient of upper tail dependence in terms of the copula function

A

lambda_U = lim as u -> 1- of (1 - 2u + C(u, u)) / (1 - u)
60
Q

Fundamental copulas

A
  • Independence (product) copula: C(u, v) = uv
  • Comonotonic (minimum) copula: C(u, v) = min(u, v) — perfect positive dependence
  • Counter-monotonic (maximum) copula: C(u, v) = max(u + v - 1, 0) — perfect negative dependence
61
Q

Graphical representation of independence copula

A

A scatter plot of (u, v) pairs spread uniformly over the unit square.
62
Q

Graphical representation of comonotonous copula

A

All points lie on the diagonal v = u of the unit square.
63
Q

Graphical representation of counter-monotonic copula

A

All points lie on the line v = 1 - u of the unit square.
64
Q

Gumbel copula

A
  • Upper tail dependence determined by parameter alpha
  • No lower tail dependence
65
Q

Clayton copula

A
  • Lower tail dependence determined by alpha
  • No upper tail dependence
66
Q

Frank copula

A
  • Interdependence structure in which there is no upper or lower tail dependence
67
Q

Gaussian copula

A

C(u, v) = Phi_rho(Phi^{-1}(u), Phi^{-1}(v)), built from the bivariate normal CDF with correlation rho.

No upper or lower tail dependence for |rho| < 1.
68
Q

Archimedean copula

A

Copulas of the form C(u, v) = psi^{-1}(psi(u) + psi(v)) for a generator function psi (e.g. the Gumbel, Clayton and Frank copulas).
69
Q

Student’s t copula

A

The analogue of the Gaussian copula built from the multivariate Student's t distribution; exhibits both upper and lower tail dependence.
70
Q

Tail dependence of all copulas

A
  • Gaussian: none (for |rho| < 1)
  • Student's t: upper and lower
  • Gumbel: upper only
  • Clayton: lower only
  • Frank: none
71
Q

PDF of the reinsurer’s claim amount under XOL with retention M

A

The reinsurer pays Z = max(0, X - M). Conditional on the reinsurer being involved (X > M), the density of Z is f(z + M) / (1 - F(M)) for z > 0.
72
Q

Mean, variance and skewness of compound Poisson process with parameter lambda

A

For S = X_1 + … + X_N with N ~ Poisson(lambda) and m_k = E[X^k]:

  • E[S] = lambda * m_1
  • var(S) = lambda * m_2
  • skew(S) = E[(S - E[S])^3] = lambda * m_3
73
Q

Coefficient of skewness of compound Poisson distribution

A

lambda * m_3 / (lambda * m_2)^{3/2}
74
Q

Sum of independent compound Poisson random variables

A

Also compound Poisson, with Poisson parameter lambda = lambda_1 + … + lambda_n and claim size distribution given by the corresponding mixture of the component claim distributions.
75
Q

n choose k

A

n! / (k! (n - k)!)
76
Q

Machine Learning: Confusion matrices

A

Cross-tabulation of predicted class against actual class: true positives, false positives, false negatives and true negatives. Used to assess the accuracy of a classifier.
77
Q

Machine Learning: Hyperparameters

A

Variables external to the model whose values are set in advance by the user. They are chosen based on the user’s knowledge and experience in order to produce a model that works well.

78
Q

Machine Learning: Parameters

A

variables internal to the model whose values are estimated from the data and are used to calculate predictions using the model.

79
Q

Machine Learning: Regularisation or penalisation

A

Adding a penalty on model complexity (e.g. on the size of the coefficients) to the fitting criterion, in order to reduce overfitting and improve generalisation.
80
Q

Branches of Machine Learning

A
  • Supervised learning
  • Unsupervised learning
  • Semi-supervised learning
  • Reinforcement learning
81
Q

Machine Learning: Stages of analysis

A
82
Q

Machine Learning: Data Types

A
  • Numerical (continuous or discrete)
  • Categorical (nominal or ordinal)
83
Q

Machine Learning: Train-Validate-Test approach

A

Split data into

  • data for training (60%)
  • data for validation (20%)
  • data for testing (20%)
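
The split above can be sketched with a fixed seed for reproducibility (function name is illustrative; proportions are those on the card):

```python
import random

def train_validate_test_split(data, seed=0):
    """Shuffle and split into 60% train / 20% validation / 20% test,
    with a fixed seed so the split is reproducible."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    i, j = int(0.6 * n), int(0.8 * n)
    return shuffled[:i], shuffled[i:j], shuffled[j:]

train, val, test = train_validate_test_split(list(range(100)))
print(len(train), len(val), len(test))  # 60 20 20
```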
84
Q

Machine Learning: Requirements for analysis to be reproducible

A
  • Data used should be fully described and available to other researchers
  • Any modification to the data should be clearly described
  • Selection of the algorithm and the development of the model should be described (including parameters and why they were chosen)
  • Ideally would provide computer code used
  • Specify seed value
85
Q

Machine Learning: Penalised generalised linear models

A

Maximise the penalised (log-)likelihood: the usual GLM log-likelihood minus a penalty on the size of the coefficients (e.g. ridge or LASSO penalties).

86
Q

Machine Learning: Naive Bayes Classification

A

A classifier applying Bayes' theorem with the ‘naive’ assumption that the features are conditionally independent given the class; predict the class with the highest resulting posterior probability.
87
Q

Machine Learning: Gini index of a final node in a decision tree

A

G = sum over classes k of p_k (1 - p_k) = 1 - sum of p_k^2

where p_k is the proportion of the node's observations in class k; G = 0 for a pure node.
88
Q

Machine Learning: Gini index of a decision tree

A

The weighted average of the Gini indices of its final nodes, weighted by the proportion of observations in each node.
89
Q

Machine Learning: K-means clustering advantages and disadvantages

A

Advantages: simple, fast and scalable to large data sets.

Disadvantages: K must be chosen in advance; results depend on the initial centroids; sensitive to outliers and to the scale of the variables; assumes roughly spherical clusters.