Final Study Cards Flashcards
What is a scalar valued function?
A scalar valued function is a function which maps R^n to R (it takes a vector input and returns a single number)
When is a scalar valued function linear?
A scalar valued function is linear if it has the form f(x1,…,xn) = a1x1 + a2x2 + ··· + anxn
i.e., it is affine with b = 0, or equivalently with f(0) = 0.
When is a scalar valued function affine?
A scalar valued function is affine if it has the form f(x1,…,xn) = a1x1 + a2x2 + ··· + anxn + b for some numbers a1,…,an,b (so b = f(0)).
What is the derivative matrix of a vector valued function?
The derivative matrix of a vector valued function f(x) = (f_1(x), f_2(x), …, f_m(x)), where x = (x_1, …, x_n),
is the m x n matrix of partial derivatives
| ∂f_1/∂x_1  ∂f_1/∂x_2  ∂f_1/∂x_3  …  ∂f_1/∂x_n |
| ∂f_2/∂x_1  ∂f_2/∂x_2  ∂f_2/∂x_3  …  ∂f_2/∂x_n |
| ∂f_3/∂x_1  ∂f_3/∂x_2  ∂f_3/∂x_3  …  ∂f_3/∂x_n |
| …                                              |
| ∂f_m/∂x_1  ∂f_m/∂x_2  ∂f_m/∂x_3  …  ∂f_m/∂x_n |
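As a sketch (not from the cards), the derivative matrix can be approximated numerically by finite differences, one column per input variable. The function f and the point (2, 3) below are made-up examples.

```python
import numpy as np

def jacobian_fd(f, x, h=1e-6):
    """Approximate the derivative matrix (Df)(x): column j holds
    (f(x + h*e_j) - f(x)) / h, a forward-difference partial derivative."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x))
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = h
        J[:, j] = (np.asarray(f(x + step)) - fx) / h
    return J

# Hypothetical example: f(x, y) = (x*y, x + y**2), whose exact derivative
# matrix at (2, 3) is [[y, x], [1, 2y]] = [[3, 2], [1, 6]].
f = lambda v: np.array([v[0] * v[1], v[0] + v[1] ** 2])
J = jacobian_fd(f, [2.0, 3.0])
```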
What are the two linear approximations for a function f at x using the derivative matrix (where a is a point close to x)?
f(x) ≈ f(a) + ((Df)(a))(x − a)
and
f(a + h) ≈ f(a) + ((Df)(a)) h (for small h)
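A quick numeric check of the second approximation, using a made-up f whose derivative matrix is known exactly:

```python
import numpy as np

# Hypothetical example: f(x, y) = (x**2, x*y), with derivative matrix
# (Df)(x, y) = [[2x, 0], [y, x]].
f = lambda v: np.array([v[0] ** 2, v[0] * v[1]])
Df = lambda v: np.array([[2 * v[0], 0.0], [v[1], v[0]]])

a = np.array([1.0, 2.0])
h = np.array([0.01, -0.02])   # a small displacement

exact = f(a + h)
approx = f(a) + Df(a) @ h     # f(a + h) ≈ f(a) + ((Df)(a)) h
# The two agree up to an error that is quadratic in h.
```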
What is the linearity principle?
For scalars c1, c2 ∈ R and vectors v1, v2 ∈ R^n, a linear function f satisfies f(c1v1 + c2v2) = c1f(v1) + c2f(v2).
When is a function g linear?
When g(cx) = cg(x) and g(x + y) = g(x) + g(y) for all scalars c and all vectors x, y
What is the rotation matrix for R^2?
Aθ =
|cos θ −sin θ|
|sin θ cos θ|
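The formula is easy to check numerically; the 90-degree rotation below is just an illustration:

```python
import numpy as np

def rotation(theta):
    """A_theta rotates vectors in R^2 counterclockwise by theta radians."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating e1 = (1, 0) by 90 degrees should give (0, 1).
v = rotation(np.pi / 2) @ np.array([1.0, 0.0])
```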
What is the identity matrix and what are its properties?
The identity matrix is an n x n square matrix such that its entries are equal to
|1 0 0 … 0|
|0 1 0 … 0|
|0 0 1 … 0|
|… |
|0 0 0 … 1|
The special property of the identity matrix is that any m x n matrix A multiplied by its respective n x n identity matrix is equal to A,
so A I_n = A (and likewise I_m A = A)
What are the important properties of matrix multiplication
(MM1) It recovers matrix-vector multiplication: if A is an m × n matrix, and x ∈ R^n is thought of as an n × 1 matrix, the matrix-matrix product Ax is the same as the matrix-vector product.
(MM2) A(B + C) = AB + AC and (A′ + B′)C′ = A′C′ + B′C′. (These "distributive laws" are the reason we call it matrix multiplication.)
(MM3) A(BC) = (AB)C, and A(cB) = (cA)B = c(AB) for any scalar c. In particular, taking C to be an m × 1 matrix that is a column vector v by another name, A(Bv) = (AB)v.
(MM4) If A is an m × n matrix, then I_m A = A = A I_n, where I_m is the m × m identity matrix and I_n is the n × n identity matrix.
What are some important assertions about matrix multiplication
In general AB ≠ BA (matrix multiplication is not commutative)
AB = AC does not imply B = C (you cannot cancel A unless it is invertible)
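Both assertions are easy to witness with small concrete matrices (the matrices below are arbitrary examples):

```python
import numpy as np

# Non-commutativity: A @ B and B @ A differ.
A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])
noncommutative = not np.array_equal(A @ B, B @ A)

# With a non-invertible A2, A2 @ B2 = A2 @ C2 can hold even though B2 != C2.
A2 = np.array([[1, 0], [0, 0]])
B2 = np.array([[1, 2], [3, 4]])
C2 = np.array([[1, 2], [5, 6]])
cancellation_fails = np.array_equal(A2 @ B2, A2 @ C2) and not np.array_equal(B2, C2)
```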
How does one set up and solve a markov chain problem?
First create a transition matrix M which models how the totals at each position change over one cycle. If x_1 is the vector of original values at step 1, then the vector of values at the nth step equals M^(n−1) x_1 (one application of M per cycle).
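A minimal sketch with a hypothetical 2-state transition matrix (the percentages and starting totals are invented for illustration):

```python
import numpy as np

# Hypothetical chain: each cycle, 90% of position 1 stays and 10% moves to
# position 2, while 50% of position 2 moves to position 1 and 50% stays.
# Columns sum to 1, so the total is preserved.
M = np.array([[0.9, 0.5],
              [0.1, 0.5]])
x0 = np.array([100.0, 100.0])            # original values

x3 = np.linalg.matrix_power(M, 3) @ x0   # totals after 3 cycles
```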
What is the multivariable chain rule at a point v = (v1,…,vn) ∈ R^n?
(D(f ◦ g))(v) = (Df)(g(v)) (Dg)(v)
What is the definition of an inverse matrix B to a matrix A?
The inverse matrix B is a matrix such that BA = I_n and AB = I_n
If A is invertible, what is Ax = b equal to?
x = (A^-1)(b)
It is important to note that the position of A^-1 is maintained relative to the vector it multiplies (so x ≠ b A^-1; order matters)
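A short check of x = A^-1 b on a made-up system (in practice, np.linalg.solve is preferred over forming the inverse explicitly):

```python
import numpy as np

# Hypothetical system: 2x + y = 5, x + 3y = 10, with solution (1, 3).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.inv(A) @ b         # x = (A^-1) b
x_solve = np.linalg.solve(A, b)  # numerically preferable, same answer
```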
What is true of n x n matrices with respect to their inverse?
If A and B are n × n matrices that satisfy AB = I_n then A is invertible and B is its
inverse; i.e., the other equation BA = I_n automatically holds.
How can you immediately tell if a set of vectors is not linearly independent?
If it contains the zero vector (a vector whose entries are all zero), the collection is automatically not linearly independent
When is a vector non-zero
When it has at least one entry which is not zero
The upper left to lower right diagonal of the inverse of an invertible triangular matrix is always what
1/the original value at that position in the diagonal (i.e. the reciprocals). Note this holds for triangular (including diagonal) matrices, not for general square matrices.
What is the determinant of an upper or lower triangular matrix?
The product of its diagonal entries
When is a 2x2 matrix invertible?
When its determinant is not equal to 0
What is the determinant of a 2x2 matrix
The determinant is ad − bc, where the entries are
| a b |
| c d |
When is a matrix A invertible?
A is invertible precisely when Ax = 0 has x = 0 as its only solution, and A is non-invertible precisely when Ax = 0 has a nonzero solution.
What are some important matrix inversion rules?
(i) When A is invertible you can "cancel A" by multiplying both sides by A^-1 (but there is a caveat; see the Warning below):
– Cancelling an invertible matrix on the left: if AB = AC and A is invertible then B = C. This holds because you multiply both sides on the left by A^-1.
– Cancelling an invertible matrix on the right: if BA = CA and A is invertible then B = C. For this you need to multiply both sides on the right by A^-1.
– Warning: our caveat is that if AB = CA, then you cannot cancel A on the left on one side and on the right on the other, so you cannot conclude in this case that B = C, even when A is invertible.
(ii) If A and B are both invertible n × n matrices then AB is also invertible, and (AB)^-1 = B^-1 A^-1 (note the switch of order of multiplication on the right side!).
What is the inverse matrix of a 2x2 matrix (if it is invertible)?
Its inverse matrix is
(1/(ad − bc)) | d  -b |
              | -c  a |
where the original matrix is
|a b|
|c d|
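The formula above can be sketched directly and checked against A A^-1 = I (the matrix below is an arbitrary example):

```python
import numpy as np

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the (1/(ad - bc)) [[d, -b], [-c, a]]
    formula; requires ad - bc != 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return (1.0 / det) * np.array([[d, -b],
                                   [-c, a]])

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Ainv = inverse_2x2(A)   # should satisfy A @ Ainv = I
```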
Newton's method convergence
(Card to be added.)
When is a collection of vectors considered linearly independent?
When none of the vectors belong to the span of the others
When is a collection of vectors linearly independent (precise definition)?
A collection of vectors v_1,…,v_k ∈ R^n is linearly independent precisely when the
only collection of scalars a_1,…,a_k for which
a_1v_1 + a_2v_2 + ···+ a_kv_k = 0
is a_1 = 0,a_2 = 0,…,a_k = 0.
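In practice this is equivalent to checking that the matrix with the vectors as columns has rank k, which is a one-liner (the vectors below are arbitrary examples):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Vectors are independent iff the matrix having them as its columns
    has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

indep = is_linearly_independent([np.array([1.0, 0.0, 0.0]),
                                 np.array([0.0, 1.0, 0.0])])
dep = is_linearly_independent([np.array([1.0, 2.0]),
                               np.array([2.0, 4.0])])   # second = 2 * first
```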
How do you do the Gram-Schmidt process for a collection of vectors v_1, v_2, …, v_k?
Let v_1,…,v_k be nonzero n-vectors with span V in R^n.
Let w_1 = v_1 and define B_1 to be {w_1} (an orthogonal basis for V_1!).
Let w_2 = v_2 − Proj_{w_1}(v_2).
Let w_3 = v_3 − Proj_{w_1}(v_3) − Proj_{w_2}(v_3),
and so on, defining at the jth step w_j = v_j − Proj_{V_{j−1}}(v_j)
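The steps above can be sketched as a short loop; the input vectors below are made-up examples:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize nonzero vectors: w_j = v_j minus its projections
    onto all the earlier w's."""
    ws = []
    for v in vectors:
        w = v.astype(float)
        for u in ws:
            w = w - (np.dot(u, v) / np.dot(u, u)) * u   # subtract Proj_u(v)
        ws.append(w)
    return ws

ws = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0])])
# The resulting w's are pairwise orthogonal.
```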
How do you determine if a set of vectors {v1,v2,…,vk} are linearly independent?
If dim(span({v_1, v_2, …, v_k})) = k
What is the dimension of the orthogonal complement to a linear subspace V ⊆ R^n?
The dimension of the orthogonal complement is n − dim(V)
What is the transpose of a matrix?
It is the resulting matrix when you flip the entries of the matrix across the upper left to lower right diagonal (so the (i, j) entry of A^T is the (j, i) entry of A)
What does the transpose allow you to do?
The transpose allows you to move a matrix across a dot product, so for example
(Ax) · y = x · (A^T y)
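This identity is easy to verify numerically on random data (the shapes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
x = rng.standard_normal(2)
y = rng.standard_normal(3)

lhs = np.dot(A @ x, y)     # (Ax) . y
rhs = np.dot(x, A.T @ y)   # x . (A^T y)
# The two scalars agree up to floating-point roundoff.
```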
What are the important properties of matrix algebra
A(B + C) = AB + AC, (A + B)C = AC + BC, A(BC) = (AB)C, and
A I_n = A = I_m A for an m × n matrix A. But AB ≠ BA in general!
* (From before) Sometimes matrices are invertible. When they are invertible, you can multiply
by their inverse to cancel them: for example, if AB = AC and A is invertible then B = C,
whereas if AB = CA (with invertible A) then you can't conclude anything.
* (From before) Invertible matrices are always square; i.e., n × n for some n. Inversion reverses
the order of multiplication: (AB)^-1 = B^-1 A^-1 if A and B are both invertible.
* (New) The transpose of an m × n matrix is an n × m matrix, and transpose reverses the order
of multiplication: (AB)^T = B^T A^T.
* (New) If A is invertible so is A^T, and (A^T)^-1 = (A^-1)^T.
* (New) For v, w ∈ R^n viewed as n × 1 matrices, the 1 × 1 matrix product v^T w equals [v · w].
This yields efficient manipulation of dot products of many vectors at once via matrix algebra.
What are the properties of a symmetric matrix
A^T = A and the matrix is square (i.e. n x n). The inverse of an invertible symmetric matrix is always symmetric
When is a given matrix A orthogonal?
When its transpose times it equals the identity matrix (A^T A = I_n),
or equivalently
when the n columns of A are an orthonormal collection of n-vectors (so A is square)
What is the inverse of an orthogonal matrix?
Its transpose
Given matrices A and B are orthogonal, what is true of their product?
Their product regardless of order will be orthogonal
What is the column space of a matrix?
The span of its columns
When is an upper or lower triangular matrix invertible?
When all its diagonal entries are non-zero
How does one solve an equation given the LU decomposition?
Write A = LU, so L(Ux) = b. Set Ux = y = (y_1, y_2, …, y_n) and solve Ly = b using forward substitution (L is lower triangular).
Then plug the values of y back into Ux = y and solve for x via back substitution (U is upper triangular)
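The two-step procedure can be sketched on a small hypothetical pair of LU factors (the numbers below are invented):

```python
import numpy as np

# Hypothetical LU factors of A = L @ U.
L = np.array([[1.0, 0.0],
              [2.0, 1.0]])
U = np.array([[3.0, 1.0],
              [0.0, 4.0]])
b = np.array([5.0, 14.0])

# Step 1: solve L y = b by forward substitution (top row first).
y = np.empty(2)
y[0] = b[0] / L[0, 0]
y[1] = (b[1] - L[1, 0] * y[0]) / L[1, 1]

# Step 2: solve U x = y by back substitution (bottom row first).
x = np.empty(2)
x[1] = y[1] / U[1, 1]
x[0] = (y[0] - U[0, 1] * x[1]) / U[0, 0]
# Now (L @ U) @ x reproduces b.
```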
How does one solve an equation given the QR decomposition?
Since Q is orthogonal we can say that
QRx = b => Rx = (Q^T)b
Solve this upper triangular system by back substitution to get the answer
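A minimal numpy sketch of this on a made-up system, using the built-in QR factorization:

```python
import numpy as np

# Hypothetical system: 2x + y = 3, x + y = 2, with solution (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.array([3.0, 2.0])

Q, R = np.linalg.qr(A)            # A = QR, Q orthogonal, R upper triangular
x = np.linalg.solve(R, Q.T @ b)   # Rx = (Q^T) b
```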
How do you solve the inverse of a matrix given its LU decomposition?
The inverse of a matrix given its LU decomposition is A^-1 = U^-1 L^-1 (note the reversed order: (LU)^-1 = U^-1 L^-1).
To find these, let L′ and U′ be the inverses of L and U: they are triangular with the reciprocal values along the diagonal and unknowns elsewhere. Then solve L′L = I and U′U = I for the unknowns using forward or back substitution
How do you solve the inverse of a matrix given its QR decomposition?
Since A = QR and Q^-1 = Q^T, the inverse is A^-1 = R^-1 Q^T, where R^-1 is found by back substitution on R R^-1 = I (R is upper triangular)
How do you find the QR decomposition of a matrix?
Given a matrix with columns A = (v1, v2, v3, …, vn), run Gram-Schmidt on the given vectors to get w1, w2, w3, …, wn. Q has as its columns the normalized vectors u_i = w_i/||w_i||.
To find R, write each v_j in its equivalent form in terms of the orthonormal basis u1, …, un:
the jth column of R holds those coefficients, so R_{ij} = u_i · v_j (the coefficient of w_i scaled by ||w_i||), and R is upper triangular
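The construction can be sketched end to end on a small made-up matrix, verifying that Q times R reproduces A:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
v1, v2 = A[:, 0], A[:, 1]

# Gram-Schmidt: w1 = v1, w2 = v2 - Proj_{w1}(v2); normalize to get Q's columns.
w1 = v1
w2 = v2 - (np.dot(w1, v2) / np.dot(w1, w1)) * w1
u1, u2 = w1 / np.linalg.norm(w1), w2 / np.linalg.norm(w2)
Q = np.column_stack([u1, u2])

# R expresses each v_j in the orthonormal basis: R[i, j] = u_i . v_j,
# upper triangular because v_j involves only u_1, ..., u_j.
R = np.array([[np.dot(u1, v1), np.dot(u1, v2)],
              [0.0,            np.dot(u2, v2)]])
```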