Math Flashcards
1
Q
inverse of matrix
A
AX = B
A⁻¹AX = A⁻¹B
IX = A⁻¹B
X = A⁻¹B
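A quick NumPy check of this derivation (matrix and right-hand side are illustrative; `np.linalg.solve` is preferred in practice over forming the inverse explicitly):

```python
import numpy as np

# Solve AX = B two ways: via the explicit inverse, as in the derivation,
# and via a linear solver, which is numerically more stable.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[5.0],
              [10.0]])

X_inv = np.linalg.inv(A) @ B      # X = A⁻¹B
X_solve = np.linalg.solve(A, B)   # preferred: never forms A⁻¹
```

Both give the same X; `solve` avoids the extra rounding error of computing the inverse.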
2
Q
outer product
A
- for vectors u ∈ ℝᵐ and v ∈ ℝⁿ, the outer product uvᵀ is the m×n matrix with entries uᵢvⱼ
- contrast with the inner (dot) product uᵀv, which is a scalar
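A minimal NumPy sketch (vectors illustrative):

```python
import numpy as np

# Outer product of u (m,) and v (n,): an (m, n) matrix with entries u[i]*v[j].
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

outer = np.outer(u, v)            # shape (3, 2)
same = u[:, None] * v[None, :]    # equivalent via broadcasting
```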
3
Q
matrix operations
A
- Associativity of Addition: A + ( B + C ) = ( A + B ) + C
- Associativity of Scalar Multiplication: (cd) A = c (dA)
- Distributive: c(A + B) = cA + cB
- Distributive: (c + d) A = cA + dA
- Associativity of Multiplication: A(BC) = (AB)C
- Left Distributive: A(B + C) = AB + AC
- Right Distributive: (A + B )C = AC + BC
- Scalar Associativity / Commutativity: c(AB) = (cA)B = A(cB) = (AB)c
- Multiplicative Identity: IA = AI = A
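A numerical spot-check of a few of these identities on random matrices (NumPy; seed and shapes are illustrative):

```python
import numpy as np

# Verify several matrix identities numerically on random 3x3 matrices.
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
c, d = 2.0, -1.5

checks = [
    np.allclose(A + (B + C), (A + B) + C),    # associativity of addition
    np.allclose((c * d) * A, c * (d * A)),    # scalar associativity
    np.allclose(A @ (B @ C), (A @ B) @ C),    # associativity of multiplication
    np.allclose(A @ (B + C), A @ B + A @ C),  # left distributivity
]
```

Note the checks use `allclose`, not exact equality: floating-point matrix products of the two sides can differ in the last bits.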
4
Q
adding large numbers
A
- work in log space: take the log of each number
- products become sums of logs; sums of exponentials use the log-sum-exp trick
- log ∑ exp(xᵢ) = m + log ∑ exp(xᵢ − m), where m = max xᵢ
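A sketch with NumPy's `logaddexp`, assuming the goal is adding numbers too large to represent directly in floating point:

```python
import numpy as np

# exp(1000) overflows a float64, but log-sum-exp computes
# log(e^a + e^b) stably without ever forming e^1000.
a, b = 1000.0, 1000.0

with np.errstate(over="ignore"):
    naive = np.exp(a) + np.exp(b)   # overflows to inf
stable = np.logaddexp(a, b)         # log(e^1000 + e^1000) = 1000 + log 2
```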
5
Q
statistical power
A
- the probability that the test rejects the null hypothesis H₀ when the alternative H₁ is true
- power = 1 − β = Pr(reject H₀ | H₁ is true)
- high power = low probability of a type II error
- power is determined by the α significance criterion, the effect size and sample size, and the chosen or implied β
6
Q
statistical power calculation
A
- under H₀ there is no difference: µ₀ = 0
- test statistic: Tₙ
- find the critical value c that Tₙ must exceed for rejection at α = 0.05, using the quantile function
- power function: B(θ) = Pr(Tₙ > c | µ_D = θ)
- the "greater than" probability is rewritten as 1 − the CDF
- B(θ) = 1 − Φ((c − θ)/(σ_D/√n)), for the one-sample t-test
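A sketch of this power calculation for a one-sided one-sample z-test with known σ, using the stdlib `statistics.NormalDist`; the α, σ, n, and θ values are illustrative:

```python
from math import sqrt
from statistics import NormalDist

Z = NormalDist()                      # standard normal
alpha, sigma, n = 0.05, 1.0, 25
se = sigma / sqrt(n)                  # sigma_D / sqrt(n)

c = Z.inv_cdf(1 - alpha) * se         # critical value from the quantile fn
theta = 0.5                           # true mean difference under H1
power = 1 - Z.cdf((c - theta) / se)   # B(theta) = 1 - Phi((c - theta)/se)
```

With these numbers the test has roughly 80% power, a common planning target.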
7
Q
one sample t-test (paired t-test)
A
- t = (x̄ − µ₀)/(s/√n), with sample standard deviation s and n − 1 degrees of freedom
- paired t-test: apply the one-sample test to the per-pair differences dᵢ, with µ₀ = 0
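A hand-rolled one-sample t statistic, t = (x̄ − µ₀)/(s/√n); the data and µ₀ are illustrative:

```python
from math import sqrt
from statistics import mean, stdev

# One-sample t statistic: how many standard errors the sample mean
# lies from the hypothesized mean mu0.
x = [5.1, 4.9, 5.6, 4.7, 5.3, 5.0, 5.4]
mu0 = 5.0

n = len(x)
t = (mean(x) - mu0) / (stdev(x) / sqrt(n))   # compare to t(n-1) quantiles
```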
8
Q
two sample unpooled t-test
A
- t = (x̄₁ − x̄₂)/√(s₁²/n₁ + s₂²/n₂)
- degrees of freedom from the Welch–Satterthwaite approximation
- use when the two groups may have unequal variances
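A hand-rolled unpooled (Welch) t statistic; the two samples are illustrative:

```python
from math import sqrt
from statistics import mean, variance

# Welch t statistic: each sample contributes its own variance term,
# so no pooled-variance (equal variance) assumption is needed.
x1 = [2.1, 2.5, 2.3, 2.7, 2.4]
x2 = [1.8, 2.0, 1.7, 2.2]

se = sqrt(variance(x1) / len(x1) + variance(x2) / len(x2))
t = (mean(x1) - mean(x2)) / se
```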
9
Q
confidence
A
- confidence = 1 - alpha, where alpha = significance
- e.g. 0.95 = 1 − α, the area under the sampling distribution N(µ, σ²/N) between the interval endpoints
10
Q
Gaussian confidence interval
A
- CI = x̄ ± zα/2(σ/√N)
- for CI = 0.95, z = ±1.96
- z-score = (x̄ − µ)/(σ/√N)
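A sketch of the interval computation (stdlib `statistics.NormalDist`; x̄, σ, and N are illustrative):

```python
from math import sqrt
from statistics import NormalDist

# 95% Gaussian confidence interval for the mean, sigma assumed known.
xbar, sigma, N = 5.2, 1.5, 100
z = NormalDist().inv_cdf(0.975)   # two-sided: alpha/2 in each tail, z ≈ 1.96

half = z * sigma / sqrt(N)
ci = (xbar - half, xbar + half)
```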
11
Q
Bernoulli confidence interval
A
- binary case
- p = successes/N
- CI = p ± z√(p(1-p)/N)
12
Q
wilson interval
A
- p = successes/N
- (p + z²/2N)/(1 + z²/N) ± (z/(1 + z²/N))√(p(1-p)/N + z²/4N²)
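Comparing the simple (Wald) interval with the Wilson interval on a small sample, with illustrative counts, shows why Wilson is preferred near the boundary:

```python
from math import sqrt

# Wald vs Wilson intervals for a Bernoulli proportion.
successes, N, z = 8, 10, 1.96
p = successes / N

# Wald (normal approximation) interval
half_wald = z * sqrt(p * (1 - p) / N)
wald = (p - half_wald, p + half_wald)

# Wilson score interval
center = (p + z**2 / (2 * N)) / (1 + z**2 / N)
half = (z / (1 + z**2 / N)) * sqrt(p * (1 - p) / N + z**2 / (4 * N**2))
wilson = (center - half, center + half)
```

Here the Wald upper bound exceeds 1 (an impossible proportion), while the Wilson interval stays inside [0, 1].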
13
Q
conjugate priors
A
- pairs of likelihoods and priors such that the posterior has the same distributional family as the prior
- precision is inverse variance: λ⁻¹ = σ²
- likelihood: X ~ N(µ, τ⁻¹)
- the conjugate prior for a Gaussian likelihood (known precision) is Gaussian
- prior: µ ~ N(m₀, λ₀⁻¹)
- posterior ∝ likelihood × prior
14
Q
lift
A
- lift = p(A, B)/(p(A)p(B)) = p(A|B)/p(A) = p(B|A)/p(B)
- if A and B are independent, lift = 1
15
Q
bayesian approach
A
1. start with the likelihood: p(X|θ) = Π p(Xᵢ|θ)
2. choose a prior: p(θ)
3. calculate the posterior: p(θ|X) ∝ p(X|θ)p(θ)
- posterior update given a normal likelihood & prior:
- likelihood: Xᵢ ~ N(µ, τ⁻¹)
- prior: µ ~ N(m, λ⁻¹)
- posterior mean: m′ = (λm + τ∑xᵢ)/(λ + Nτ)
- precision update: λ′ = λ + Nτ
- the prior is placed on the parameters: Bernoulli has p, Gaussian has µ, σ
- Gaussian density: p(x|µ, σ) = exp(−(x−µ)²/(2σ²))/√(2πσ²)
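A sketch of the conjugate normal update above (prior parameters, τ, and the data are illustrative):

```python
# Conjugate Gaussian update: prior mu ~ N(m, 1/lam), likelihood
# x_i ~ N(mu, 1/tau) with known precision tau.
m, lam = 0.0, 1.0          # prior mean and precision
tau = 4.0                  # known likelihood precision (sigma = 0.5)
x = [0.9, 1.1, 1.0, 0.8, 1.2]
N = len(x)

lam_post = lam + N * tau                      # lambda' = lambda + N*tau
m_post = (lam * m + tau * sum(x)) / lam_post  # m' = (lam*m + tau*sum x)/lambda'
```

The posterior mean is a precision-weighted compromise between the prior mean and the data.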
16
Q
sensitivity/recall
A
- true positive rate: TP / (TP + FN)
- cost of false negative high
- e.g. missile detection: failing to detect an incoming missile (a false negative) is catastrophic
17
Q
specificity
A
- true negative rate: TN / (TN + FP)
- cost of false positive high?
18
Q
precision
A
- TP / (TP + FP)
- how often it's correct when it predicts positive
- cost of false positive high
- e.g. skin cancer tests, where false positives lead to unnecessary procedures
19
Q
accuracy
A
- (TP + TN)/(TP + TN + FP + FN)
- summarizes overall performance
- does not reveal class imbalance: predicting the majority class everywhere can still score high
20
Q
ROC
A
- logistic regression classifies with a 0.5 threshold by default (P = 50%)
- different thresholds give different true positive / false positive rates; the ROC curve plots TPR against FPR across all thresholds
21
Q
f1 score
A
- 2*(precision*recall)/(precision+recall)
- good = low FP and low FN
- e.g. identifying real threats without raising constant false alarms
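These metrics computed from illustrative confusion-matrix counts, showing how accuracy can look strong under class imbalance while F1 does not:

```python
# Precision, recall, F1, and accuracy from confusion-matrix counts.
# The counts are illustrative: a rare positive class (60 of 1000).
TP, FP, FN, TN = 40, 10, 20, 930

precision = TP / (TP + FP)
recall = TP / (TP + FN)                            # sensitivity
f1 = 2 * precision * recall / (precision + recall)
accuracy = (TP + TN) / (TP + TN + FP + FN)
```

Accuracy is 0.97 thanks to the 930 true negatives, while F1 (~0.73) exposes the weaker minority-class performance.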
22
Q
4th order runge-kutta
A
- k1 = h f(tn, yn)
- k2 = h f(tn + h/2, yn + k1/2)
- k3 = h f(tn + h/2, yn + k2/2)
- k4 = h f(tn + h, yn + k3)
- yn+1 = yn + (k1 + 2k2 + 2k3 + k4)/6
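A minimal implementation of one RK4 step, checked against the exact solution of y′ = y (step size and interval are illustrative):

```python
from math import exp

# Classical 4th-order Runge-Kutta step for y' = f(t, y).
def rk4_step(f, t, y, h):
    k1 = h * f(t, y)
    k2 = h * f(t + h / 2, y + k1 / 2)
    k3 = h * f(t + h / 2, y + k2 / 2)
    k4 = h * f(t + h, y + k3)
    return y + (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Integrate y' = y, y(0) = 1 from t = 0 to t = 1 in 10 steps;
# the exact solution is y(1) = e.
t, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
```

With h = 0.1 the result matches e to about six decimal places, consistent with RK4's O(h⁴) global error.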