Statistics & Financial Modelling Flashcards
Quantitative Finance
Random Experiment
Random Experiment – a process leading to an uncertain outcome
Basic Outcome
Basic Outcome – a possible outcome of a random experiment
Sample Space (S)
Sample Space (S) – the collection of all possible outcomes of a random experiment
Event (E)
Event (E) – any subset of basic outcomes from the sample space
Intersection of Events
Intersection of Events – If A and B are two events in a sample space S, then the intersection, A ∩ B, is the set of all outcomes in S that belong to both A and B
Mutually Exclusive Events
A and B are Mutually Exclusive Events if they have no basic outcomes in common
i.e., the set A ∩ B is empty
Union of Events
Union of Events – If A and B are two events in a sample space S, then the union, A ∪ B, is the set of all outcomes in S that belong to either A or B
Collectively Exhaustive
Events E1, E2, …, Ek are Collectively Exhaustive events if E1 ∪ E2 ∪ … ∪ Ek = S
- i.e., the events completely cover the sample space
Complement
The Complement of an event A is the set of all basic outcomes in the sample space that do not belong to A. The complement is denoted Ā
Let the Sample Space be the collection of all possible outcomes of rolling one die: S = {1, 2, 3, 4, 5, 6}
Let A be the event “Number rolled is even”
Let B be the event “Number rolled is at least 4”
Then
A = {2, 4, 6} and B = {4, 5, 6}
Q: What are the complements, intersections, and unions of A and B? Are they mutually exclusive? Collectively exhaustive?
Complements:
- The complement of A is {1, 3, 5}; the complement of B is {1, 2, 3}
Intersection:
- A ∩ B = {4, 6}
Union:
- A ∪ B = {2, 4, 5, 6}
Mutually exclusive:
- A and B are not mutually exclusive
- The outcomes 4 and 6 are common to both
Collectively exhaustive:
- A and B are not collectively exhaustive
- A ∪ B does not contain 1 or 3
Probability
Probability – the chance that an uncertain event will occur (always between 0 and 1)
0 ≤ P(A) ≤ 1 For any event A
Assessing Probability Methods
There are three approaches to assessing the probability of an uncertain event:
- classical probability
- relative frequency probability
- subjective probability
Classical Probability Method
Assumes all outcomes in the sample space are equally likely to occur
Classical probability of event A:
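If the sample space contains N equally likely basic outcomes, of which NA satisfy event A, then
P(A) = NA / N = (number of outcomes that satisfy A) / (total number of outcomes in S)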
Permutations
Permutations: the number of possible arrangements when x objects are to be selected from a total of n objects and arranged in order [with (n – x) objects left over]
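In symbols, the number of permutations of x objects chosen from n is n! / (n – x)!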
Conditional probability
A conditional probability is the probability of one event, given that another event has occurred:
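In symbols, the conditional probability of A given that B has occurred is
P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0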
Statistical Independence
- Two events are statistically independent if and only if: P(A ∩ B) = P(A)P(B)
Events A and B are independent when the probability of one event is not affected by the other event
- If A and B are independent, then
P(A|B) = P(A), if P(B) > 0
P(B|A) = P(B), if P(A) > 0
Joint and Marginal Probabilities
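The joint probability of two events is P(A ∩ B). If B1, B2, …, Bk are mutually exclusive and collectively exhaustive events, the marginal probability of A is the sum of the joint probabilities:
P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk)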
Odds
- The odds in favor of a particular event are given by the ratio of the probability of the event divided by the probability of its complement
- The odds in favor of A are:
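odds in favor of A = P(A) / (1 – P(A)) = P(A) / P(Ā)
For example, if P(A) = 0.75, the odds in favor of A are 0.75 / 0.25 = 3, i.e., 3 to 1.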
Bayes’ Theorem
A mnemonic for Bayes' formula: remember "AB AB AB", then group it as "A|B = A × B|A / B", i.e., P(A|B) = P(A) × P(B|A) / P(B).
Although Bayes' theorem is just a formula for computing a probability, one of its best-known uses is analysing "false positive" and "false negative" test results.
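In the false-positive setting, P(B) in the denominator is expanded by the law of total probability: P(B) = P(B|A)P(A) + P(B|Ā)P(Ā).
As a worked illustration with made-up numbers: suppose 1% of a population has a condition, a test detects it with probability 0.99, and the false-positive rate is 0.05. Then
P(condition | positive) = (0.99 × 0.01) / (0.99 × 0.01 + 0.05 × 0.99) ≈ 0.167
so a positive result still leaves only about a 17% chance of actually having the condition.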
Overinvolvement Ratio
Using a Tree Diagram
Random Variable
Represents a possible numerical value from a random experiment
Discrete Random Variable
Takes on no more than a countable number of values
Continuous Random Variable
- Can take on any value in an interval
- Possible values are measured on a continuum
Probability Distributions for Discrete Random Variables
Let X be a discrete random variable and x be one of its possible values.
- The probability that random variable X takes specific value x is denoted P(X = x)
- The probability distribution function of a random variable is a representation of the probabilities for all the possible outcomes.
- Can be shown algebraically, graphically, or with a table.
Probability Distribution Required Properties
- 0 ≤ P(x) ≤ 1 for any value of x
- The individual probabilities sum to 1: ΣP(x) = 1, where the sum is over all possible values of x
Cumulative Probability Function
The cumulative probability function, denoted F(x0), shows the probability that X does not exceed the value x0:
F(x0) = P(X ≤ x0)
where the function is evaluated at all values of x0.
Derived Relationship
The derived relationship between the probability distribution and the cumulative probability distribution.
Let X be a random variable with probability distribution P(x) and cumulative probability distribution F(x0). Then
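F(x0) = ΣP(x), where the sum is taken over all possible values x that are less than or equal to x0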
Derived Properties
Derived properties of cumulative probability distributions for discrete random variables.
Let X be a discrete random variable with cumulative probability distribution F(x0). Then
- 0 ≤ F(x0) ≤ 1 for every number x0
- for x0 < x1, then F(x0) ≤ F(x1)
Properties of Discrete Random Variables
Expected Value (or mean) of a discrete random variable X:
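μx = E[X] = ΣxP(x), where the sum is over all possible values x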
Variance and Standard Deviation (Formula)
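σ²x = E[(X – μx)²] = Σ(x – μx)²P(x), summed over all possible values x; equivalently σ²x = E[X²] – μ²x
The standard deviation σx is the positive square root of the variance.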
Functions of Random Variables
If P(x) is the probability function of a discrete random variable X , and g(X) is some function of X , then the expected value of function g is
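E[g(X)] = Σg(x)P(x), summed over all possible values x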
Linear Functions of Random Variables
- Let random variable X have mean μx and variance σ²x
- Let a and b be any constants.
- Let Y = a + bX
- Then the mean and variance of Y are
μY = E(a + bX) = a + bμx
σ²Y = Var(a + bX) = b²σ²x
- so that the standard deviation of Y is
σY = |b|σx
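As a quick check with made-up numbers: if μx = 10 and σ²x = 4, then for Y = 3 + 2X we get μY = 3 + 2(10) = 23, σ²Y = 2²(4) = 16 and σY = |2|(2) = 4.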
Properties of Linear Functions of Random Variables
- Let a and b be any constants.
- a) E(a) = a and Var(a) = 0
i.e., if a random variable always takes the value a, it will have mean a and variance 0
- b) E(bX) = bμx and Var(bX) = b²σ²x
i.e., the expected value of bX is b·E(X) and its variance is b²·Var(X)
Probability Distributions
Bernoulli Distribution (the binomial distribution with n = 1)
* Consider only two outcomes: “success” or “failure”
- Let p denote the probability of success
- Let 1 – p be the probability of failure
- Define random variable X:
x = 1 if success, x = 0 if failure
- Then the Bernoulli probability distribution is
P(0)=(1–p) and P(1)=p
Mean and Variance of a Bernoulli Random Variable
The mean is μx = p
The variance is σ²x = p(1 – p)
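The variance follows from a one-line calculation: σ²x = E[(X – p)²] = (0 – p)²(1 – p) + (1 – p)²p = p(1 – p).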
Developing the Binomial Distribution
Binomial Probability Distribution
- A fixed number of observations, n
- e.g., 15 tosses of a coin; ten light bulbs taken from a warehouse
- Two mutually exclusive and collectively exhaustive categories
- e.g., head or tail in each toss of a coin; defective or not defective light bulb
- Generally called “success” and “failure”
- Probability of success is p, probability of failure is 1 – p
- Constant probability for each observation
- e.g., Probability of getting a tail is the same each time we toss the coin
- Observations are independent
- The outcome of one observation does not affect the outcome of the other
The Binomial Distribution
P(x) = probability of x successes in n trials,
with probability of success p on each trial
x = number of ‘successes’ in sample (x = 0, 1, 2, …, n)
n = sample size (number of independent trials or observations)
p = probability of “success”
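Putting the pieces together, the binomial probability function is
P(x) = [n! / (x!(n – x)!)] p^x (1 – p)^(n – x), for x = 0, 1, 2, …, n
where n! / (x!(n – x)!) counts the number of different orderings that give exactly x successes in n trials.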
Shape of Binomial Distribution
The shape of the binomial distribution depends on the values of p and n
*Mean and Variance of a Binomial Distribution
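For X binomial with parameters n and p:
μ = E[X] = np
σ² = np(1 – p)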
Using Binomial Tables
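Software can stand in for the printed tables; a minimal sketch using Python's scipy.stats (assuming scipy is installed), with illustrative values n = 15 and p = 0.5:

from scipy.stats import binom

n, p = 15, 0.5                             # 15 coin tosses, fair coin (illustrative values)
print(binom.pmf(7, n, p))                  # P(X = 7): probability of exactly 7 successes
print(binom.cdf(7, n, p))                  # P(X <= 7): cumulative probability
print(binom.mean(n, p), binom.var(n, p))   # np and np(1 - p)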
The Hypergeometric Distribution
- “n” trials in a sample taken from a finite population of size N
- Sample taken without replacement
- Outcomes of trials are dependent
- Concerned with finding the probability of “X” successes in the sample where there are “S” successes in the population
* Hypergeometric Probability Distribution
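With S successes in a population of size N, the probability of x successes in a sample of n items drawn without replacement is
P(x) = [C(S, x) × C(N – S, n – x)] / C(N, n)
where C(a, b) = a! / (b!(a – b)!) is the number of ways to choose b items from a.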
Example :Using the Hypergeometric Distribution
- 3 different computers are checked from 10 in the department. 4 of the 10 computers have illegal software loaded. What is the probability that 2 of the 3 selected computers have illegal software loaded?
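Here N = 10, S = 4, n = 3 and x = 2, so
P(2) = C(4, 2) × C(6, 1) / C(10, 3) = (6 × 6) / 120 = 0.30
A quick check in Python (assuming scipy is available; scipy's argument order is x, population size N, successes S, sample size n):

from scipy.stats import hypergeom
print(hypergeom.pmf(2, 10, 4, 3))   # 0.3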
Jointly Distributed Discrete Random Variables
Properties of Joint Probability Distributions
Properties of Joint Probability Distributions of Discrete Random Variables
Let X and Y be discrete random variables with joint probability distribution P(x, y)
- 0 ≤ P(x, y) ≤ 1 for any pair of values x and y
- the sum of the joint probabilities P(x, y) over all possible pairs of values must be 1
*Conditional Probability Distribution
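The conditional probability distribution of Y given X = x is
P(y|x) = P(x, y) / P(x), provided P(x) > 0 (and symmetrically P(x|y) = P(x, y) / P(y))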
Independence
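X and Y are independent if and only if P(x, y) = P(x)P(y) for every pair of values x and y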
Conditional Mean and Variance
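μY|X = E[Y | X = x] = Σ y P(y|x)
σ²Y|X = E[(Y – μY|X)² | X = x] = Σ (y – μY|X)² P(y|x)
with both sums taken over all possible values y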
Covariance
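Cov(X, Y) = E[(X – μx)(Y – μy)] = ΣΣ(x – μx)(y – μy)P(x, y) = E[XY] – μxμy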
*Correlation
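ρ = Corr(X, Y) = Cov(X, Y) / (σxσy); the correlation always lies between –1 and +1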
Covariance and Independence
- The covariance measures the strength of the linear relationship between two variables
- If two random variables are statistically independent, the covariance between them is 0
- The converse is not necessarily true
*Portfolio Analysis, mean, variance.
- Let random variable X be the price for stock A
- Let random variable Y be the price for stock B
- The market value, W, for the portfolio is given by the linear function
W = aX + bY
(a is the number of shares of stock A, b is the number of shares of stock B)
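The mean and variance of the portfolio value follow from the rules for linear combinations of random variables:
E[W] = aμx + bμy
Var(W) = a²σ²x + b²σ²y + 2ab Cov(X, Y)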
*Example: Investment Returns
*Example: Portfolio
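A minimal numerical sketch of the portfolio calculation in Python (all input values are made up purely for illustration):

a, b = 10, 20                        # hypothetical share counts for stocks A and B
mu_x, mu_y = 25.0, 40.0              # hypothetical mean prices
var_x, var_y = 81.0, 121.0           # hypothetical price variances
cov_xy = 40.0                        # hypothetical covariance between the two prices

mu_w = a * mu_x + b * mu_y                                # portfolio mean: a*mu_x + b*mu_y
var_w = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy  # portfolio variance
print(mu_w, var_w, var_w ** 0.5)                          # mean, variance, standard deviation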
*Interpreting the Results for Investment Returns
Continuous Random Variables
- A continuous random variable is a variable that can assume any value in an interval
- thickness of an item
- time required to complete a task
- temperature of a solution
- height, in inches
- These can potentially take on any value, depending only on the ability to measure accurately.
Cumulative Distribution Function
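F(x) = P(X ≤ x), the probability that X does not exceed the value x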
*Probability Density Function
The probability density function, f(x), of random variable X has the following properties:
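- f(x) ≥ 0 for all values of x
- The total area under the curve f(x), over the entire range of X, equals 1
- P(a < X < b) is the area under f(x) between a and b
- The cumulative distribution function F(x) is the area under f(x) up to x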
Probability as an Area
The Uniform Distribution
The uniform distribution is a probability distribution that has equal probabilities for all equal-width intervals within the range of the random variable
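If X is uniform over the interval from a to b, its density is
f(x) = 1 / (b – a) for a ≤ x ≤ b, and f(x) = 0 otherwise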
*Expectations for Continuous Random Variables
- The mean of X, denoted μx , is defined as the expected value of X
μx= E[X]
- The variance of X, denoted σ²x, is defined as the expectation of the squared deviation, (X – μx)², of a random variable from its mean
σ²x = E[(X – μx)²]
*Mean and Variance of the Uniform Distribution
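For X uniform over the interval from a to b:
μ = (a + b) / 2
σ² = (b – a)² / 12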
*Linear Functions of Random Variables
The Normal Distribution
- Bell Shaped
- Symmetrical
- Mean, Median and Mode are Equal
- Location is determined by the mean, μ
- Spread is determined by the standard deviation, σ
- The random variable has an infinite theoretical range: -∞ to +∞
- The normal distribution closely approximates the probability distributions of a wide range of random variables
- Distributions of sample means approach a normal distribution given a “large” sample size
- Computations of probabilities are direct and elegant
- The normal probability distribution has led to good business decisions for a number of applications
Many Normal Distributions
By varying the parameters μ and σ, we obtain different normal distributions
The Normal Distribution Shape
*The Normal Probability Density Function
The formula for the normal probability density function is
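f(x) = (1 / (σ√(2π))) e^(–(x – μ)² / (2σ²)), for -∞ < x < +∞
where e ≈ 2.71828 and π ≈ 3.14159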
*Cumulative Normal Distribution
* Finding Normal Probabilities
*The Standard Normal Distribution
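Any normal variable X with mean μ and standard deviation σ can be transformed into the standard normal Z, which has mean 0 and variance 1:
Z = (X – μ) / σ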
Comparing X and Z units
Note that the distribution is the same, only the scale has changed. We can express the problem in original units (X) or in standardized units (Z)
Appendix Table 1
- The Standard Normal Distribution table in the textbook (Appendix Table 1) shows values of the cumulative normal distribution function.
- For a given Z-value a, the table shows F(a), the area under the curve from negative infinity to a.
*The Standard Normal Table
General Procedure for Finding Probabilities
To find P(a < X < b) when X is distributed normally:
- Draw the normal curve for the problem in terms of X;
- Translate X-values to Z-values;
- Use the Cumulative Normal Table.
*Finding Normal Probabilities
Suppose X is normal with mean 8.0 and standard deviation 5.0;
Find P(X < 8.6)
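Z = (8.6 – 8.0) / 5.0 = 0.12, so P(X < 8.6) = P(Z < 0.12) = F(0.12) ≈ 0.5478 from the cumulative normal table.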
*Upper Tail Probabilities
Suppose X is normal with mean 8.0 and standard deviation 5.0.
Now Find P(X > 8.6)
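P(X > 8.6) = 1 – P(X ≤ 8.6) = 1 – F(0.12) ≈ 1 – 0.5478 = 0.4522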
*Finding the X value for a Known Probability
Steps to find the X value for a known probability:
- Find the Z value for the known probability
- Convert to X units using the formula:
xa = μ + zaσ
Example:
- Suppose X is normal with mean 8.0 and standard deviation 5.0.
- Now find the X value so that only 20% of all values are below this X
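The Z value with 20% of the distribution below it is approximately z = -0.84 (since F(-0.84) ≈ 0.20), so
x = μ + zσ = 8.0 + (-0.84)(5.0) = 3.80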