Definitions Flashcards

1
Q

What is a Vector Space?

A

A Vector Space over a field K is a triplet consisting of a set of vectors, a vector addition and a scalar multiplication which satisfies the following rules:
1. Closure of +: u + v in V
2. Closure of *: av in V
3. Associativity of +: (u + v) + w = u + (v + w)
4. Associativity of *: a(bv) = (ab)v
5. Commutativity of +: u + v = v + u
6. Distributivity of scalar + over vectors: (a + b)v = av + bv
7. Distributivity of scalars over vector +: a(u + v) = au + av
8. Existence of zero vector s.t. 0 + v = v
9. Existence of additive inverse vector: u + (-u) = 0
10. Multiplicative identity satisfies 1v = v

2
Q

Give the definition of a distance function on a vector space.

A

A distance function is a map d: VxV -> [0, inf[ s.t.:
1. d(u,v) >= 0 and d(u,v) = 0 iff u = v
2. d(u,v) = d(v,u)
3. d(u,w) <= d(u,v) + d(v,w) (triangle inequality)

3
Q

What is a metric vector space?

A

A metric vector space is a vector space equipped with a distance function.

4
Q

Give the definition of a norm.

A

A Norm on a vector space is a map ||.||: V -> [0, inf[ s.t.:
1. ||u|| >= 0, and ||u|| = 0 iff u = 0
2. ||av|| = |a|*||v||
3. ||u + v|| <= ||u|| + ||v|| (triangle inequality)

5
Q

What is the definition of an inner product?

A

An inner product on a V.S is a map <,>: VxV -> R s.t.:
1. <u,v> = <v,u>
2. <au,v> = a<u,v>
3. <v + w,u> = <v,u> + <w,u>
4. <v,v> >=0 and is equal to zero iff v=0

6
Q

What is the definition of a subspace?

A

W in V is a subspace iff:
1. Closure under +
2. Closure under *
3. The zero vector belongs to W

7
Q

What is the definition of a linear combination?

A

A Linear combination of a set of vectors is an expression of the form: a1*v1 + a2*v2 + … + an*vn = v

8
Q

What is the definition of the span of a set S in V?

A

Span(S) = {a1*s1 + … + an*sn | ai in R and si in S}. It is the set of all vectors that can be reached by linear combinations of the vectors in the set S.

9
Q

How can we determine if a set of vectors is independent?

A

The set is independent if the only linear combination equal to zero, a1*v1 + … + an*vn = 0, is the one where a1, …, an are all equal to zero.

10
Q

What is the Basis of a Vector Space?

A

B in V is a Basis if:
1. Span(B) = V
2. B is independent.

11
Q

What is the definition of the Dimension of a V.S?

A

The number of elements in a basis of a vector space is called the dimension, Dim(V).

12
Q

What is the mathematical definition of two orthogonal vectors?

A

<u,v> = 0

13
Q

What is the definition of a Projection of u onto v?

A

Proj_v(u) = (<u,v>/<v,v>) * v
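A quick numerical check of this formula (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

def proj(u, v):
    """Projection of u onto v: (<u,v>/<v,v>) * v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])
p = proj(u, v)   # component of u along v
r = u - p        # the residual u - p is orthogonal to v
```

The residual being orthogonal to v is what makes the projection the closest point to u on the line spanned by v.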

14
Q

What is the Cauchy-Schwarz Theorem?

A

<u,v>^2 <= <u,u> * <v,v>
or
|<u,v>| <= ||u||*||v||
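The inequality can be verified numerically, with equality attained for parallel vectors (a NumPy sketch, not part of the original card):

```python
import numpy as np

# Check |<u,v>| <= ||u||*||v|| on random vectors
rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)
lhs = abs(np.dot(u, v))
rhs = np.linalg.norm(u) * np.linalg.norm(v)

w = 2.0 * u  # parallel to u, so the bound is attained with equality
eq_lhs = abs(np.dot(u, w))
eq_rhs = np.linalg.norm(u) * np.linalg.norm(w)
```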

15
Q

What is the definition of an angle between two vectors?

A

cos(theta(u,v)) = <u,v>/(||u||*||v||)

16
Q

What is the definition of a Linear Transformation over a Vector Space?

A

A Linear Transformation is a function T: V -> W which must satisfy:
1. T(v1 + v2) = T(v1) + T(v2)
2. T(av1) = aT(v1)

17
Q

What is the Null space of a Linear Transformation?

A

Given a Linear Transformation T: V -> W, the Null Space or Kernel of T is the set of elements in V that map to the zero vector:
N(T) = {v in V: T(v) = 0}

18
Q

What is the Image of T?

A

The image is the set of all outputs of T (equivalently, the span of the images of a basis of V).
Im(T) = {w = T(v): v in V}

19
Q

What is the nullity of the linear transformation T: V -> W?

A

The nullity is the dimension of the Null space.
Note the rank-nullity theorem: Dim(V) = nullity(T) + rank(T)
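The rank-nullity relation can be checked numerically; here the null space dimension is counted via zero singular values (a NumPy sketch, not part of the original card):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # rank 1, maps R^3 -> R^2
n = A.shape[1]                 # dimension of the domain V
rank = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.sum(s > 1e-10)  # null space dim = count of (numerically) zero singular values
```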

20
Q

Give the definition of an Orthogonal Matrix. What characteristics does it preserve when used as a Linear Transformation?

A

An orthogonal matrix Q is a matrix with orthonormal columns. That is, its columns are pairwise orthogonal and of norm 1. It preserves:
1. Inner products s.t. (Qu)^T * (Qv) = u^T v = <u,v>
2. Norms s.t. ||Qu|| = ||u||
3. Angles
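These preservation properties are easy to verify on a rotation matrix (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation: orthonormal columns

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
ip_before = u @ v
ip_after = (Q @ u) @ (Q @ v)   # inner product is preserved, hence norms and angles too
```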

21
Q

What is the Definition of a Full QR Decomposition?

A

Given A = [a1, a2, …, an], an mxn matrix with full column rank, the Full QR decomposition A = QR is the extension of the Reduced Form: the orthonormal columns q1, …, qn are completed with qn+1, …, qm (an orthonormal basis of the complement), and R is padded with m - n zero rows, s.t.

[a1, a2, …, an] = [q1, …, qn, qn+1, …, qm] [ r11 …  r1n ]
                                           [       …  … ]
                                           [        rnn ]
                                           [  0  …   0  ]

Writing Q = [Qr, Q^c], with Qr the first n columns and Rr the top n rows, we can write A = Qr*Rr + Q^c*[0], recovering the reduced form.
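NumPy exposes this directly via the 'complete' mode of its QR routine (an illustrative sketch, not part of the original card):

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])                  # 3x2, full column rank
Q, R = np.linalg.qr(A, mode='complete')  # full QR: Q is 3x3, R is 3x2
# The bottom m - n rows of R are zero, so A = Q[:, :2] @ R[:2, :] (the reduced form)
```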

22
Q

Name some properties of orthogonal matrices.

A
  1. Preserves inner products
  2. Det(Q) = +-1
    Proof: 1 = Det(I) = Det(Q^T Q) = Det(Q)^2, so Det(Q) = +-1
  3. The inverse of Q is orthogonal, since Q^{-1} = Q^T
    Note that Q is not necessarily equal to Q^T
23
Q

What is the Definition of a Givens Matrix?

A

A Givens Matrix is of the form
[c -s]
[s  c]
s.t. c^2 + s^2 = 1
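A Givens rotation is typically chosen to zero out one component of a vector, as in QR factorization (an illustrative NumPy sketch, not part of the original card; the sign convention is one common choice):

```python
import numpy as np

def givens(a, b):
    """Givens rotation [c -s; s c] with c^2 + s^2 = 1 that zeroes b in [a, b]^T."""
    r = np.hypot(a, b)
    c, s = a / r, -b / r
    return np.array([[c, -s],
                     [s,  c]])

G = givens(3.0, 4.0)
y = G @ np.array([3.0, 4.0])   # rotated onto the first axis
```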

24
Q

What is the definition of a Householder Matrix?

A

A Householder matrix corresponding to a vector v is defined by H_v = I - 2vv^T/(v^T v)
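The defining formula translates directly into code; H_v is a reflection, so it is symmetric, orthogonal, and sends v to -v (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

def householder(v):
    """H_v = I - 2 v v^T / (v^T v): reflection across the hyperplane orthogonal to v."""
    v = v.reshape(-1, 1)
    return np.eye(len(v)) - 2.0 * (v @ v.T) / (v.T @ v)

v = np.array([1.0, 1.0])
H = householder(v)
# H is symmetric, orthogonal, its own inverse, and sends v to -v
```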

25
Q

What is a Regression?

A

Regression refers to the collection of techniques for describing or capturing the dependencies between different datasets using a certain functional relationship. Regression usually finds the best fit for that functional relationship through parameters. The notion of "best fit" is a choice: for example, minimizing the mean squared error (the L2 norm of the residuals), but we could instead minimize the absolute error (the L1 norm).

26
Q

Describe what a Singular Value Decomposition is.

A

The SVD allows us to break down the action of an mxn matrix A on vectors as a sequence of simpler steps:
1. Reflection/Rotation in the domain R^n (by V^T)
2. Scaling (by Sigma)
3. Reflection/Rotation in the image space R^m (by U)
Contrary to the eigenvalue decomposition, the SVD always exists, and if A is real, so is its SVD.

27
Q

Describe the 3 matrices of the SVD decomposition

A

The middle matrix Sigma is a diagonal matrix of ordered singular values. V is a matrix of right singular vectors and U is a matrix of left singular vectors, both orthogonal matrices.

We can represent the Singular Value Decomposition in this way:

A = Sum(sigma_i * u_i * v_i^T). And so A, seen as a linear transformation, is a weighted sum of simpler rank-one transformations u_i * v_i^T. The singular value associated with each pair of left and right singular vectors gives the importance of that pair within the whole transformation.

28
Q

What is the definition of a Matrix norm?

A

The induced matrix norm of A is ||A|| = max over x != 0 of ||Ax||/||x||,
or equivalently,
max of ||Ax|| over all x with ||x|| = 1.
In other words, it is the maximum stretch/scaling that the matrix A can apply to a vector x.
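For the 2-norm, this maximum stretch equals the largest singular value, which is easy to confirm on a diagonal matrix (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

A = np.array([[3., 0.],
              [0., 2.]])
# The induced 2-norm equals the largest singular value of A
sigma_max = np.linalg.svd(A, compute_uv=False)[0]
norm_A = np.linalg.norm(A, 2)
```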

29
Q

What is the Definition of an Eigenvalue?

A

An eigenvalue is a scalar lambda s.t. there exists a non-zero vector, called an eigenvector, satisfying Av = lambda*v. In other words, an eigenpair of A is a pair of a scalar and a vector where the effect of A on the eigenvector reduces to a simple scaling by the eigenvalue.

An eigenbasis is a basis made of eigenvectors; in it, the action of A on any vector is obtained by expanding the vector in the basis and scaling each coordinate by the corresponding eigenvalue.

30
Q

What is the Definition of a Characteristic Polynomial?

A

The characteristic polynomial of A is defined as det(A - lambda*I); its roots are the eigenvalues of A.

31
Q

What is the definition of Algebraic Multiplicity?

A

The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial, i.e. the exponent of its corresponding linear factor.

32
Q

What is the definition of Geometric Multiplicity?

A

For a given eigenvalue of A, the geometric multiplicity is the number of distinct, linearly independent eigenvectors corresponding to the eigenvalue. Equivalently, the dimension of the underlying eigenspace.

33
Q

What is the definition of a Defective Matrix?

A

A Matrix is defective if the algebraic multiplicity of at least one eigenvalue is higher than its geometric multiplicity.

34
Q

What is Diagonalization and which matrices are diagonalizable?

A

Matrices with a full set of linearly independent eigenvectors are diagonalizable. Let V be the nxn matrix whose columns are those eigenvectors and D the diagonal matrix of the corresponding eigenvalues; then AV = VD, or
A = V*D*inv(V)
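This factorization can be reproduced directly from a numerical eigendecomposition (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])       # symmetric, hence diagonalizable
evals, V = np.linalg.eig(A)    # columns of V are eigenvectors of A
D = np.diag(evals)
A_rebuilt = V @ D @ np.linalg.inv(V)   # A = V D V^{-1}
```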

35
Q

What is the definition of similar matrices?

A

A and B are similar if there exists an invertible matrix X s.t. B = X*A*inv(X). B and A share the characteristic polynomial, eigenvalues and multiplicities, but not always eigenvectors.

36
Q

What is the definition of a positive definite matrix?

A

A (symmetric) matrix is positive definite if its eigenvalues are real and positive, or non-negative for semi-definite.
Pos. Def. matrices have equivalent Eigenvalue and Singular Value Decompositions.

37
Q

What is the mathematical definition of a symmetric matrix?

A

A^T = A (the transpose of A equals A)

38
Q

Describe what is the eigendecomposition of a matrix.

A

Mathematically, A = Q*V*inv(Q).
Q is a square matrix whose columns are the eigenvectors of A, and V is the diagonal matrix with elements corresponding to its eigenvalues.

39
Q

What is the mathematical definition of a unitary matrix?

A

A real matrix A is unitary (orthogonal) if A^T A = A A^T = I, i.e. A^T = inv(A). For complex entries, the transpose is replaced by the conjugate transpose A^H.

40
Q

State the existence of full SVD theorem.

A

For A in R^(mxn):
1. There exists U in R^(mxm) orthogonal
2. Sigma = Diag(σ_1, σ_2, …, σ_min(m,n)) in R^(mxn) s.t. σ_1 >= σ_2 >= … >= σ_min(m,n) >= 0
3. There exists V in R^(nxn) orthogonal
s.t.

A = U * Sigma * V^T

Note: A = Sum (from k=1 to min(m,n)) σ_k * u_k * v_k^T
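The theorem's shapes and the rank-one expansion can be checked with NumPy's full SVD (an illustrative sketch, not part of the original card):

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 0.]])      # 2x3
U, s, Vt = np.linalg.svd(A)       # full SVD: U is 2x2, Vt is 3x3, s has min(m,n) values
# Rank-one expansion: A = sum_k sigma_k * u_k * v_k^T
A_sum = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
```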

41
Q

What are the components of the SVD decomposition that are unique?

A

Singular values are unique, but the SVD is not. However, if Sigma is ordered and the singular values are distinct, then the SVD is unique up to a sign on matching columns of U and V.

42
Q

State Schur’s Factorization Theorem for square matrices.

A

For any square matrix A with complex entries, A can be expressed as:
A = Q * U * inv(Q)
Where Q is a unitary matrix and U is upper triangular. Note that U is similar to A, and U's diagonal entries are A's eigenvalues. All square matrices have a Schur decomposition, but it is not always unique.

43
Q

State the LU decomposition of a non-singular matrix A.

A

The LU decomposition of a non-singular matrix A is the factorization of A into the product of a Lower Triangular matrix L and an Upper Triangular matrix U s.t.
A = LU.
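A minimal Doolittle-style elimination illustrates the factorization; this sketch omits pivoting (it assumes no zero pivots arise) and is not part of the original card:

```python
import numpy as np

def lu_doolittle(A):
    """LU factorization without pivoting (a sketch; assumes no zero pivots arise)."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]       # elimination multiplier stored in L
            U[i, k:] -= L[i, k] * U[k, k:]    # zero out the entry below the pivot
    return L, U

A = np.array([[4., 3.],
              [6., 3.]])
L, U = lu_doolittle(A)
```

In practice a library routine with partial pivoting (P A = L U) is used for numerical stability.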

44
Q

State the normal equations method for the least square problem.

A

The residual r = b - Ax of the least-squares solution satisfies A^T r = 0, which gives the normal equations: A^T A x = A^T b.
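Solving the normal equations on a small overdetermined system shows the orthogonality of the residual (an illustrative NumPy sketch, not part of the original card):

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 2., 2.])
# Normal equations: A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
r = b - A @ x   # the residual is orthogonal to the columns of A
```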

45
Q

State the QR decomposition method for the least square problem.

A

Compute the reduced QR decomposition A = QR; then min ||b - Ax|| is solved by back substitution on Rx = Q^T b.

46
Q

State the Power Iteration Method to find the highest eigenvalue.

A

Start from an arbitrary non-zero vector x0 and repeat x_{k+1} = A*x_k / ||A*x_k||. The iterates converge to the dominant eigenvector (assuming a unique largest-magnitude eigenvalue), and the Rayleigh quotient x_k^T A x_k recovers the corresponding eigenvalue.
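The method is a few lines in practice (an illustrative NumPy sketch, assuming a unique dominant eigenvalue; not part of the original card):

```python
import numpy as np

def power_iteration(A, iters=200):
    """Dominant eigenpair of A (assumes a unique largest-magnitude eigenvalue)."""
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)   # renormalize to avoid overflow/underflow
    lam = x @ A @ x              # Rayleigh quotient recovers the eigenvalue
    return lam, x

A = np.array([[2., 0.],
              [0., 5.]])
lam, x = power_iteration(A)
```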

47
Q

Describe what the shifted inverse power iteration is and what it is used for.

A

The Power Iteration method can only find the largest eigenvalue; the Inverse Power method finds the smallest one. Say we thought that an eigenvalue was close to 1: shifting the matrix A by the identity matrix times that scalar (A - 1*I in our case) moves that eigenvalue of the shifted matrix close to zero. Running the inverse power iteration on the shifted matrix finds its smallest eigenvalue mu, and we recover the eigenvalue of A as lambda = shift + mu.
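The shift-and-invert idea can be sketched as follows (an illustrative NumPy implementation under the assumption that A - shift*I is invertible; not part of the original card):

```python
import numpy as np

def shifted_inverse_power(A, shift, iters=100):
    """Eigenvalue of A closest to `shift`, via inverse iteration on A - shift*I."""
    M = A - shift * np.eye(A.shape[0])
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = np.linalg.solve(M, x)    # one step of the inverse power method
        x /= np.linalg.norm(x)
    mu = x @ M @ x                   # Rayleigh quotient for M = A - shift*I
    return shift + mu, x             # recover the eigenvalue of A

A = np.array([[4., 1.],
              [1., 3.]])
lam, x = shifted_inverse_power(A, shift=1.0)   # eigenvalues of A are (7 +- sqrt(5))/2
```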

48
Q

State Schur’s Decomposition Theorem.

A

Given any square complex matrix A, there exist a unitary Q and an upper triangular U s.t. A = Q*U*Q^H.