Discrete Random Variables Flashcards
Continuous Random Variable
If F_X is continuous and differentiable with derivative f_X, then X is a continuous random variable; f_X is the probability density function of X
Discrete Random Variable
Suppose Ω is a countable sample space. Then a function X: Ω → ℝ is called a discrete random variable
Joint Distribution Function
F(x1,…,xn)=P(X1≤x1,…,Xn≤xn) where X1,…,Xn are random variables
Marginal Distribution Function
Distribution function of a single random variable
Joint Probability Mass Function
p(x1,…,xn)=P(X1=x1,…,Xn=xn) where X1,…,Xn are discrete random variables
p-value
The observed significance level of a test: the probability, under H₀, of obtaining a value of the test statistic at least as extreme as that observed
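As a minimal sketch of this idea (assuming a two-sided z-test with a standard normal test statistic, a setup not specified on the card), the p-value follows from the normal tail probability:

```python
import math

def two_sided_p_value(z: float) -> float:
    """Two-sided p-value for an observed z statistic under H0: Z ~ N(0, 1).

    P(|Z| >= |z|) = 2 * (1 - Phi(|z|)) = erfc(|z| / sqrt(2)).
    """
    return math.erfc(abs(z) / math.sqrt(2))

# A z statistic of about 1.96 gives the familiar p ≈ 0.05
print(round(two_sided_p_value(1.96), 3))
```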
Power
1-β
Measures the test’s ability to detect a departure from H₀ when it exists
Type 1 error
When we reject H₀ when it is true. This has probability α
Type 2 error
When we accept H₀ when it is in fact false. This has probability β
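A quick simulation sketch (the sample size, α = 0.05, and the z-test are illustrative choices, not from the cards) shows that rejecting H₀ when it is in fact true happens with probability roughly α:

```python
import math
import random

random.seed(0)

ALPHA = 0.05
Z_CRIT = 1.96          # two-sided critical value for alpha = 0.05
N, SIMS = 30, 2000

rejections = 0
for _ in range(SIMS):
    # Data generated under H0: mean 0, known variance 1
    sample = [random.gauss(0.0, 1.0) for _ in range(N)]
    z = (sum(sample) / N) / (1.0 / math.sqrt(N))
    if abs(z) > Z_CRIT:
        rejections += 1   # a Type 1 error: H0 is true but rejected

rate = rejections / SIMS
print(f"empirical Type 1 error rate: {rate:.3f}")  # close to alpha = 0.05
```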
Linear Statistical Model
The regression curve is a linear function of the parameters in the model
Regression Curve
μ(x) as a function of x
Binomial distribution
X is the number of successes recorded during a sequence of n>1 independent Bernoulli trials, each with the same probability of success
The geometric distribution
X is the number of independent Bernoulli trials required before a success is observed
The negative binomial distribution
X is the number of independent Bernoulli trials required before r>1 successes have been observed
Hypergeometric distribution
X is the number of objects having a particular attribute when we draw n objects at random from a population of N, of which M have the attribute of interest
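The hypergeometric pmf follows directly by counting (a sketch; the parameter values in the usage lines are made up for illustration):

```python
from math import comb

def hypergeom_pmf(k: int, N: int, M: int, n: int) -> float:
    """P(X = k): k marked objects in a draw of n from N objects, M marked."""
    return comb(M, k) * comb(N - M, n - k) / comb(N, n)

# Drawing n = 4 objects from N = 20, of which M = 5 have the attribute
probs = [hypergeom_pmf(k, N=20, M=5, n=4) for k in range(5)]
print(probs)  # the probabilities over k = 0..4 sum to 1
```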
Poisson distribution
X is the number of ‘accidents’ occurring during a time period of fixed duration
Exponential distribution
Starting at time zero, let X be the time until the first accident occurs in a Poisson process with rate λ
The gamma distribution
The time until the kth accident in a Poisson process
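This connection can be checked by simulation (a sketch; k, λ, and the number of simulations are arbitrary choices): the time to the kth accident is the sum of k independent exponential inter-arrival times, so its mean is k/λ.

```python
import random

random.seed(1)

K, LAM, SIMS = 3, 2.0, 20000

# Time to the K-th accident = sum of K exponential inter-arrival times,
# which is exactly a Gamma(K, LAM) random variable
times = [sum(random.expovariate(LAM) for _ in range(K)) for _ in range(SIMS)]
mean_time = sum(times) / SIMS
print(f"empirical mean {mean_time:.3f}, theoretical k/lambda = {K / LAM:.3f}")
```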
Beta distribution
A random variable that represents a proportion measured on some continuous scale
Central limit theorem
If X1, X2, … are independent random variables having a common distribution with mean μ and variance σ², then X̄ has approximately a normal distribution with mean μ and variance σ²/n for large n
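A simulation sketch of the theorem (uniform summands and the sample sizes are arbitrary choices): means of n Uniform(0, 1) draws cluster around μ = 1/2 with variance σ²/n = 1/(12n).

```python
import random

random.seed(2)

N, SIMS = 50, 5000
MU, VAR = 0.5, 1.0 / 12.0   # mean and variance of Uniform(0, 1)

means = [sum(random.random() for _ in range(N)) / N for _ in range(SIMS)]
grand_mean = sum(means) / SIMS
emp_var = sum((m - grand_mean) ** 2 for m in means) / SIMS

print(f"mean of sample means: {grand_mean:.3f} (theory {MU})")
print(f"variance of sample means: {emp_var:.5f} (theory {VAR / N:.5f})")
```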
Estimator
A statistic whose realised value is taken as the estimate of some unknown parameter
Null Hypothesis
An assumption about a parameter which we wish to test on the basis of available data
Alternative Hypothesis
Supported when the data does not support H₀
Test Statistic
A statistic whose distribution is known under H₀, where ‘large’ values of |T| are inconsistent with H₀
Assumptions for Hypothesis Test
Normality (sample means)
Independence
Sample Variances are scaled chi-square
In large samples these hold approximately, by the Central Limit Theorem
Least Squares Method Assumptions
Errors have constant variance σ²
Errors are independent
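Under these assumptions the least-squares estimates for a straight line have closed forms (a minimal sketch; the data points below are made up): slope b = Sxy/Sxx and intercept a = ȳ − b·x̄.

```python
def least_squares(xs, ys):
    """Fit y = a + b*x by ordinary least squares."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b = sxy / sxx
    a = y_bar - b * x_bar
    return a, b

# Points lying exactly on y = 1 + 2x are recovered exactly
a, b = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```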