Exam Prep Flashcards
Pairwise Independent
Every pair of events in a collection is independent:
P(Ai)P(Aj) = P(Ai n Aj) for every pair i != j.
Mutual Independence
Every sub-collection of events satisfies the product rule P(Ai1 n ... n Aik) = P(Ai1)...P(Aik). This includes, but is stronger than, pairwise independence.
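A minimal sketch of the difference, using the classic example of two fair coin flips A, B and their XOR C (the events and probabilities here are illustrative, not from the cards):

```python
from itertools import product
from fractions import Fraction

# Sample space: two fair coin flips, each outcome has probability 1/4.
outcomes = list(product([0, 1], repeat=2))

def prob(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

A = lambda w: w[0] == 1           # first flip is heads
B = lambda w: w[1] == 1           # second flip is heads
C = lambda w: w[0] ^ w[1] == 1    # exactly one head (XOR)

# Pairwise independence: P(X n Y) = P(X)P(Y) for every pair.
pairs = [(A, B), (A, C), (B, C)]
pairwise = all(prob(lambda w: X(w) and Y(w)) == prob(X) * prob(Y) for X, Y in pairs)

# Mutual independence additionally requires P(A n B n C) = P(A)P(B)P(C).
triple = prob(lambda w: A(w) and B(w) and C(w)) == prob(A) * prob(B) * prob(C)

print(pairwise)  # True
print(triple)    # False: P(A n B n C) = 0, but P(A)P(B)P(C) = 1/8
```

So A, B, C are pairwise independent but not mutually independent.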
Normal Distribution: increase μ in pdf
Shifts the normal distribution in the positive direction by the amount μ is increased.
Normal Distribution: decrease μ in pdf
Shifts the normal distribution in the negative direction by the amount μ is decreased.
Normal Distribution: increase σ in pdf
Stretches the distribution and
lowers the maximum of the function.
Normal Distribution: decrease σ in pdf
Narrows the distribution and
heightens the maximum of the function.
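These four cards can be checked numerically: the pdf peaks at x = μ with height 1 / (σ x sqrt(2π)), so changing μ only shifts the curve and shrinking σ raises the peak. A small sketch (the μ and σ values are arbitrary):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Smaller sigma -> narrower curve with a higher maximum:
peak_narrow = normal_pdf(0, mu=0, sigma=0.5)
peak_wide = normal_pdf(0, mu=0, sigma=2.0)
print(peak_narrow > peak_wide)  # True

# Changing mu only shifts the curve: the peak value is unchanged.
print(math.isclose(normal_pdf(3, mu=3, sigma=1), normal_pdf(0, mu=0, sigma=1)))  # True
```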
Reflexive
aRa holds for every a (e.g. a = a).
Symmetric
aRb implies bRa.
Transitive
aRb and bRc implies aRc.
Sample Standard Deviation
sqrt((1/(n-1)) x sum((x - meanx)^2))
Sample Pearson Correlation
(1/(n-1)) x sum((x-meanx)(y-meany)) / (sx x sy)
where
sx = sample standard deviation of x
sy = sample standard deviation of y
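The two cards above can be sketched from scratch in Python (the data below is a made-up example chosen so the correlation is exactly 1):

```python
import math

def sample_sd(xs):
    """Sample standard deviation: sqrt(sum((x - mean)^2) / (n - 1))."""
    n = len(xs)
    m = sum(xs) / n
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))

def sample_pearson(xs, ys):
    """Sample Pearson correlation: the covariance term divided by sx * sy."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (sample_sd(xs) * sample_sd(ys))

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]          # perfectly linear in xs
r = sample_pearson(xs, ys)
print(round(r, 6))  # 1.0
```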
Isometry Matrix
An isometry is any map f : R^m -> R^m that is defined on the whole space R^m and preserves the Euclidean L_2-metric.
Isometric Transformations
Reflection
Rotation
Translation
Orthogonal transformation
Orthogonal Transformation
A transformation which preserves a symmetric inner product.
That is, it preserves lengths of vectors and angles between vectors.
Reduce dimensionality
- SVD
- PCA
- (if two-dimensional) linear regression methods
Standardised z-score
(X - μ) / σ
X = random variable
σ = standard deviation
μ = mean
Significance Level (z-test)
The test statistic is compared against the critical value set by the significance level:
(x - μ) / (σ / sqrt(n))
x = given number (like an estimated mean)
σ = standard deviation
μ = mean
n = amount of data given (like number of employees etc.)
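A minimal sketch of both formulas (the IQ-style numbers are invented for illustration):

```python
import math

def z_score(x, mu, sigma):
    """Standardised z-score: (x - mu) / sigma."""
    return (x - mu) / sigma

def z_statistic(xbar, mu, sigma, n):
    """Test statistic for a sample mean: (xbar - mu) / (sigma / sqrt(n))."""
    return (xbar - mu) / (sigma / math.sqrt(n))

print(z_score(110, mu=100, sigma=15))            # ~0.667
print(z_statistic(103, mu=100, sigma=15, n=25))  # 1.0
```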
Variance
σ^2
E(X^2) - (E(X))^2
Standard Deviation
sqrt(sum((x - meanx)^2) / N)
Sample Variance
sum((x - meanx)^2) / (N - 1)
Covariance
E(XY) - (EX)(EY)
(1/N) x sum((x-meanx)(y-meany))
Sample covariance
(1/(N-1)) x sum((x-meanx)(y-meany))
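The variance and covariance cards above, sketched from scratch (the data is a made-up example):

```python
def population_variance(xs):
    """sigma^2 = sum((x - mean)^2) / N."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def sample_variance(xs):
    """Same sum, but divided by N - 1 instead of N."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

def population_covariance(xs, ys):
    """(1/N) x sum((x - meanx)(y - meany)); equals E(XY) - E(X)E(Y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

xs = [2, 4, 6, 8]
print(population_variance(xs))   # 5.0
print(sample_variance(xs))       # ~6.667
print(population_covariance(xs, xs) == population_variance(xs))  # True: cov(X, X) = var(X)
```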
Pearson product-moment correlation
cov(X,Y) / (σ_X)(σ_Y)
Size of a vector
|u|
Square every component, sum them, then take the square root.
Angle between vectors
angle = arccos(dot(A,B) / (|A|* |B|))
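Both cards in a few lines of Python (the example vectors are arbitrary):

```python
import math

def norm(u):
    """Euclidean length: square every component, sum, take the square root."""
    return math.sqrt(sum(x * x for x in u))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle in radians via arccos(dot(u, v) / (|u| * |v|))."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

print(norm([3, 4]))           # 5.0
print(angle([1, 0], [0, 1]))  # pi/2 = 1.5707...
```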
Scalar Product
Let vector a = {1,2,3}
and vector b = {4,5,6}
Then a . b = (1 x 4) + (2 x 5) + (3 x 6) = 32.
Finding orthogonal vectors
Lets say we have u = {1,2,3}.
To find a vector v = {a,b,c} orthogonal to u, it must satisfy:
a) u . v = 0, which here rewrites as
b) a + 2b + 3c = 0 (example specific).
In this instance, we can take v = {0, -3, 2}, since 0 - 6 + 6 = 0.
Checking against a), we get:
(1 x 0) + (2 x -3) + (3 x 2) = 0.
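The two cards above, checked in Python with the same example vectors:

```python
def dot(u, v):
    """Scalar (dot) product: multiply matching components and sum."""
    return sum(a * b for a, b in zip(u, v))

u = (1, 2, 3)
v = (0, -3, 2)   # candidate orthogonal vector from the card above

print(dot((1, 2, 3), (4, 5, 6)))  # 1*4 + 2*5 + 3*6 = 32
print(dot(u, v))                  # 0 - 6 + 6 = 0, so u and v are orthogonal
```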
How to find if matrix is orthogonal, and therefore isometric
For matrix a:
a)
if A^T x A = I
or in plain text, if the transpose of the matrix multiplied by the matrix equals the identity matrix.
The diagonal of A^T x A checks this partly: it equals I on the diagonal exactly when each column of A has length 1.
b)
preserves scalar products.
Here we check that the columns are orthogonal to each other.
Therefore, we take the dot product of each pair of distinct columns, and if they all equal 0, then, as long as a) is also fulfilled, we have an isometry.
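A minimal sketch of the A^T x A = I check, using a rotation (orthogonal) and a shear (not orthogonal) as test matrices:

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    """Entry (i, j) is the dot product of row i of A with column j of B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_orthogonal(A, tol=1e-9):
    """Check A^T A = I, i.e. columns have length 1 and are mutually orthogonal."""
    n = len(A)
    P = matmul(transpose(A), A)
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))

theta = math.pi / 4
rotation = [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]
shear = [[1, 1],
         [0, 1]]
print(is_orthogonal(rotation))  # True: rotations are isometries
print(is_orthogonal(shear))     # False
```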
Sample correlation coefficient
sum((x-meanx)(y-meany)) / sqrt(sum((x-meanx)^2) x sum((y-meany)^2))
Proving something is a metric
- distance is never negative, and d(a, b) = 0 only when a = b
- distance is symmetric: d(a, b) = d(b, a)
- triangle inequality holds: d(a, c) <= d(a, b) + d(b, c)
Single Linkage
Merge clusters based upon the minimum distance between their elements.
Complete Linkage
Merge clusters based upon the maximum distance between their elements.
Hamming Distance
The number of positions at which two equal-length strings differ.
Manhattan Metric
Denoted as L_1(p, q), and is calculated by:
sum(|p_i - q_i|)
Euclidean Metric
Denoted as L_2, and is calculated by:
sqrt(sum((p_i - q_i)^2))
Max Metric
Denoted as L_∞, and is calculated by:
max(|p_i - q_i|)
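The four distance cards above, sketched together (the points and strings are made-up examples):

```python
import math

def manhattan(p, q):
    """L_1: sum of absolute coordinate differences."""
    return sum(abs(a - b) for a, b in zip(p, q))

def euclidean(p, q):
    """L_2: square root of the sum of squared coordinate differences."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def max_metric(p, q):
    """L_infinity: largest absolute coordinate difference."""
    return max(abs(a - b) for a, b in zip(p, q))

def hamming(s, t):
    """Number of positions where two equal-length sequences differ."""
    return sum(a != b for a, b in zip(s, t))

p, q = (1, 2), (4, 6)
print(manhattan(p, q))   # 3 + 4 = 7
print(euclidean(p, q))   # sqrt(9 + 16) = 5.0
print(max_metric(p, q))  # 4
print(hamming("10110", "10011"))  # 2
```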
Is matrix a bijective?
If det(a) != 0, then matrix a is bijective.
Det() for 2x2
ad - bc
Det() 3x3
a11a22a33 + a12a23a31 + a13a21a32 - a13a22a31 - a11a23a32 - a12a21a33
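Both determinant formulas as code, with the bijectivity card applied (the 3x3 matrix is a deliberately singular example):

```python
def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]]: ad - bc."""
    return a * d - b * c

def det3(M):
    """Rule of Sarrus for a 3x3 matrix."""
    return (M[0][0] * M[1][1] * M[2][2]
            + M[0][1] * M[1][2] * M[2][0]
            + M[0][2] * M[1][0] * M[2][1]
            - M[0][2] * M[1][1] * M[2][0]
            - M[0][0] * M[1][2] * M[2][1]
            - M[0][1] * M[1][0] * M[2][2])

print(det2(1, 2, 3, 4))  # 1*4 - 2*3 = -2
M = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
print(det3(M))  # 0, so this matrix is not bijective
```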
Det() signed area of a triangle
Coordinates of triangle - (x1, y1) (x2, y2) (x3, y3)
1/2 x det(
x2 - x1 x3 - x1
y2 - y1 y3 - y1
)
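The signed-area card as code (the triangle is a made-up 3-4-5-style example; the sign flips when the vertex order is reversed):

```python
def signed_area(p1, p2, p3):
    """Half the determinant of the edge vectors from p1."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return 0.5 * ((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))

# Counter-clockwise vertices give a positive area, clockwise a negative one.
print(signed_area((0, 0), (4, 0), (0, 3)))  # 6.0
print(signed_area((0, 0), (0, 3), (4, 0)))  # -6.0
```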
Convex Hull
The smallest convex set that contains a given set of points.
Convex Set
A subset whose intersection with every line is a single line segment (possibly empty or a point).
Invariant
Lets say you have apples. If we say “if it is an apple, then it is a fruit”, then we can say that being a fruit is an invariant of apples. Hence, an invariant is a true one-way “if” statement.
Complete invariant
Saying “if A is an apple, then it is produced from apple trees” is a complete invariant, because the converse “if it is produced from an apple tree, then it is an apple” is also a true statement.
Trace of a matrix
All diagonal elements summed.
If we had:
1 2 3
4 5 6
7 8 9
Then the trace would be 1 + 5 + 9 = 15.
Eigenvalue
If a nonzero vector v satisfies:
Av = λv
then λ is an eigenvalue and v an eigenvector.
A is the matrix, v is the eigenvector, λ is the eigenvalue.
Det() with eigenvalues
det(A - λI) = 0
where A is the matrix, λ is the eigenvalue, I is the identity matrix.
Finding eigenvectors / eigenvalues
From matrix a, we can
1) find eigenvalues by solving det(A - λI) = 0, i.e. the determinant of the matrix with λ subtracted from every diagonal element must equal 0.
2) find eigenvectors as the solutions of (A - λI)v = 0 for each eigenvalue λ.
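For a 2x2 matrix, det(A - λI) = 0 expands to λ^2 - trace(A)λ + det(A) = 0, which the quadratic formula solves directly. A minimal sketch assuming real eigenvalues (the diagonal matrix is a deliberately easy example):

```python
import math

def eigen_2x2(A):
    """Eigenvalues of a 2x2 matrix from det(A - lambda*I) = 0,
    i.e. lambda^2 - trace(A)*lambda + det(A) = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

A = [[2, 0],
     [0, 3]]
print(eigen_2x2(A))  # (3.0, 2.0)

# An eigenvector v for lambda solves (A - lambda*I)v = 0; for lambda = 3
# that gives v = (0, 1), and indeed A v = 3 v:
v = (0, 1)
Av = (A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1])
print(Av)  # (0, 3) = 3 * (0, 1)
```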
Change in linear basis
Lets say we had a basis RB with v1 = 3e1 + e2, v2 = -2e1 + e2, where e1, e2 are the standard basis GB.
If we wanted to translate a specific vector, lets say (2, 1), from RB to GB, we would:
1) write the vector as 2v1 + 1v2.
2) substitute: 2v1 + v2 = 2(3e1 + e2) + (-2e1 + e2).
3) distribute the 2, giving 6e1 + 2e2.
4) keep v2 as -2e1 + e2.
This results in (6e1 - 2e1) + (2e2 + e2) = 4e1 + 3e2.
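The same worked example as code, with the RB basis vectors stored by their standard-basis coordinates:

```python
# Basis vectors of RB expressed in the standard basis (e1, e2):
v1 = (3, 1)    # v1 = 3*e1 + 1*e2
v2 = (-2, 1)   # v2 = -2*e1 + 1*e2

def rb_to_standard(coords):
    """A vector with RB coordinates (r1, r2) equals r1*v1 + r2*v2."""
    r1, r2 = coords
    return (r1 * v1[0] + r2 * v2[0], r1 * v1[1] + r2 * v2[1])

print(rb_to_standard((2, 1)))  # (4, 3), i.e. 4*e1 + 3*e2
```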
Applying linear map with linear basis
1) translate RB coordinates to GB: B r_v, where B is the change-of-basis matrix.
2) apply the map in GB: A B r_v.
3) translate the result back from GB to RB: (B^-1 A B) r_v.
Steps of PCA
1) subtract the means from the respective values (centre the data)
2) find the covariance matrix: build a matrix S whose rows hold the centred x values and the centred y values, then apply the formula
S S^T / (N - 1).
3) find the eigenvalues and eigenvectors of the covariance matrix.
4) sort the eigenvectors by decreasing eigenvalue.
5) pick the top eigenvectors and transform the centred dataset by multiplication.
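The steps above can be sketched for 2D data without any libraries, reusing the 2x2 eigenvalue trick from the eigenvector card (a minimal sketch; real code would use numpy, and the data points are invented to lie near the line y = x):

```python
import math

def pca_top_component(points):
    """Principal direction of 2D data: top eigenvector of the sample
    covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    # Step 1: subtract the means.
    centered = [(x - mx, y - my) for x, y in points]

    # Step 2: sample covariance matrix entries (divide by n - 1).
    sxx = sum(x * x for x, _ in centered) / (n - 1)
    syy = sum(y * y for _, y in centered) / (n - 1)
    sxy = sum(x * y for x, y in centered) / (n - 1)

    # Step 3: largest eigenvalue of [[sxx, sxy], [sxy, syy]].
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = (tr + math.sqrt(tr * tr - 4 * det)) / 2

    # Eigenvector from (S - lam*I)v = 0, then normalise to length 1.
    vx, vy = (sxy, lam - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm)

# Points near the line y = x: the top component should be roughly (0.707, 0.707).
pts = [(1, 1.1), (2, 1.9), (3, 3.2), (4, 3.8)]
direction = pca_top_component(pts)
print(direction)
```

Step 5 would then project each centred point onto this direction by a dot product.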
Multiply 2x2 matrix
Each entry (i, j) of the product is the dot product of row i of the first matrix with column j of the second:
row1 . column1  row1 . column2
row2 . column1  row2 . column2
Multiply 2x3 matrix (by a 3x2 matrix, giving a 2x2 result)
row1 . column1  row1 . column2
row2 . column1  row2 . column2
Multiply 3x3 matrix
row1 . column1  row1 . column2  row1 . column3
row2 . column1  row2 . column2  row2 . column3
row3 . column1  row3 . column2  row3 . column3
Inverse of 2x2
- swap a and d
- negate b and c
- divide every entry by det(original)
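The multiplication and inverse cards as code; multiplying A by its inverse should recover (approximately, in floating point) the identity matrix. The example matrix is arbitrary but invertible:

```python
def matmul2(A, B):
    """2x2 product: entry (i, j) is row i of A dotted with column j of B."""
    return [[A[i][0] * B[0][j] + A[i][1] * B[1][j] for j in range(2)]
            for i in range(2)]

def inverse2(A):
    """Swap a and d, negate b and c, divide everything by det(A)."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[4, 7],
     [2, 6]]
print(matmul2(A, inverse2(A)))  # approximately [[1, 0], [0, 1]]
```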