Lecture 9 - SVD Flashcards

1
Q

SVD uses - Singular Value Decomposition

A

One of the most powerful algorithms in numerical linear algebra. Used in almost all fields of data science:

  • data analysis, big data, compression
  • machine learning
  • gene data analysis
  • model-order reduction and simulation
  • least-squares solutions, matrix pseudo-inverse
  • condition number of a matrix
  • web searching
2
Q

SVD Basics

A

It’s applicable to both real and complex matrices.

It's a factorization (like LU or Cholesky):
A = U * Σ * Vᵀ
-> A is an M x N matrix
-> U is an M x N matrix with orthonormal columns
-> Σ is an N x N diagonal matrix with the (non-negative) singular values on its diagonal
-> V is an N x N orthogonal matrix
We assume M > N (this is the reduced, or "economy", SVD).
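A minimal NumPy sketch of this factorization (A here is just random illustrative data):

    import numpy as np

    M, N = 5, 3                              # a tall matrix, M > N
    A = np.random.rand(M, N)

    # Reduced SVD: U is M x N, s holds the N singular values, Vt is N x N
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Sigma = np.diag(s)                       # the N x N diagonal matrix Σ

    print(np.allclose(A, U @ Sigma @ Vt))    # True: A = U Σ Vᵀ
    print(np.allclose(U.T @ U, np.eye(N)))   # columns of U are orthonormal
    print(np.allclose(Vt @ Vt.T, np.eye(N))) # V is orthogonal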
3
Q

Arithmetic of vectors

A

A vector a is normalized if its length (norm) is equal to 1.

We can compute the component of any vector in any direction using the dot product: if a is a unit vector, the component of b in the direction of a is a·b.
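A small sketch of that, with an illustrative vector b and direction d:

    import numpy as np

    b = np.array([3.0, 4.0])
    d = np.array([1.0, 1.0])

    a = d / np.linalg.norm(d)    # normalize the direction: ||a|| = 1
    component = a @ b            # scalar component of b along a (dot product)
    projection = component * a   # orthogonal projection of b onto that direction
    print(component, projection)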

4
Q

Orthogonality

A
  1. A pair of vectors a and b is orthogonal if the dot product a·b = 0.
  2. Two sets of vectors X and Y are orthogonal if every xi in X is orthogonal to every yj in Y.
  3. A set of nonzero vectors X is orthogonal if its elements are pairwise orthogonal.
  4. A set of vectors is ORTHONORMAL if it is orthogonal and the norm of each vector in the set is 1.
    Therefore, if the columns of X form an orthonormal set, then Xᵀ * X = I.

It follows that the vectors of such a set X are linearly independent.
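A quick numerical check of point 4, using a QR factorization as one convenient way to obtain an orthonormal set of columns:

    import numpy as np

    A = np.random.rand(5, 3)
    Q, _ = np.linalg.qr(A)                  # columns of Q are orthonormal

    print(np.allclose(Q.T @ Q, np.eye(3)))  # Xᵀ * X = I
    print(np.linalg.norm(Q[:, 0]))          # each vector has norm 1
    print(Q[:, 0] @ Q[:, 1])                # distinct vectors are orthogonal (≈ 0)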

5
Q

Linear (in)dependency

A

Two vectors a and b are linearly dependent if they lie along the same line (the same or opposite direction), which means there exists a scalar c such that a = c*b.

Otherwise, those vectors are linearly independent.

More generally:
a set of vectors is linearly dependent if at least one of the vectors in the set can be written as a linear combination of the others.
Otherwise, the vectors are linearly independent.

Conclusion => Inner products can be used to decompose arbitrary vectors into orthogonal components.
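A small sketch of that conclusion, with an illustrative orthonormal basis q1, q2 of R^2:

    import numpy as np

    q1 = np.array([1.0, 1.0]) / np.sqrt(2)
    q2 = np.array([1.0, -1.0]) / np.sqrt(2)
    v = np.array([2.0, 3.0])

    # Inner products give the coordinates of v in this basis ...
    c1, c2 = q1 @ v, q2 @ v
    # ... and the orthogonal components c1*q1 and c2*q2 reassemble v exactly
    print(np.allclose(v, c1 * q1 + c2 * q2))   # True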

6
Q

Matrix-vector multiplication (mat-vec)

A

A -> M x N matrix
x -> N-dimensional vector

The matrix-vector product b = A*x is the M-dimensional column vector with entries b_i = Σ_j a_ij * x_j.

Different perspective:
b can be regarded as a linear combination of the columns of A, with the entries of x as the coefficients.
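A small NumPy sketch showing both views of b = A*x (the matrix and vector are illustrative):

    import numpy as np

    A = np.random.rand(4, 3)         # M x N
    x = np.random.rand(3)            # N-dimensional

    b = A @ x                        # the mat-vec product, an M-dimensional vector

    # Same result seen as a linear combination of the columns of A
    b_cols = sum(x[j] * A[:, j] for j in range(A.shape[1]))
    print(np.allclose(b, b_cols))    # True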

7
Q

Hyperellipse

A

The M-dimensional generalization of an ellipse: the surface obtained by stretching the unit sphere in R^M by some factors in some orthogonal directions.
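A 2-D sketch of this construction (the directions and stretch factors are chosen only for illustration):

    import numpy as np

    theta = np.linspace(0, 2 * np.pi, 200)
    circle = np.vstack([np.cos(theta), np.sin(theta)])   # points on the unit circle

    u1 = np.array([1.0, 1.0]) / np.sqrt(2)               # orthogonal unit directions
    u2 = np.array([1.0, -1.0]) / np.sqrt(2)
    sigma1, sigma2 = 3.0, 0.5                            # stretching factors

    # Decompose each point along u1, u2 and stretch: the result traces an ellipse
    ellipse = sigma1 * np.outer(u1, u1 @ circle) + sigma2 * np.outer(u2, u2 @ circle)
    print(ellipse.shape)                                 # (2, 200)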

8
Q

SVD

A

????
