Week 5 - Model-Based Vision Flashcards
What is model-based vision
Visual understanding and interpretation are achieved through the use of explicit models of objects and scenes
What is Principal Component Analysis (PCA)
Used for dimensionality reduction
Aims to find the directions (or principal components) that capture the maximum variance in the data
What is dimensionality reduction
(2D → 1D)
Given some data
Find a line of best fit v1, the direction along which the values vary the most
If v2 (perpendicular) variation is much smaller compared to v1
We: project all points onto v1
approximately reduce the data by only saving the positions along v1 (each point is now represented by a single value)
Can also do: 3D → 2D, 4D → 3D
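The 2D → 1D projection above can be sketched in NumPy (the data points here are made-up illustrative values):

```python
import numpy as np

# Hypothetical 2D data points, one sample per row
X = np.array([[2.0, 1.9],
              [1.0, 1.1],
              [3.0, 2.8],
              [4.0, 4.2]])

# Centre the data so the line of best fit passes through the origin
Xc = X - X.mean(axis=0)

# v1: eigenvector of the covariance matrix with the largest eigenvalue,
# i.e. the direction of largest variation
C = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, np.argmax(eigvals)]

# Project every 2D point onto v1 -> one value per point (2D -> 1D)
projected = Xc @ v1
print(projected.shape)  # (4,)
```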
When does dimensionality reduction create a better representation
When the data ellipse is narrower perpendicular to v1, i.e. most of the variation lies along v1
What is the PCA Algorithm
1) Assemble data into matrix
2) Compute the covariance matrix
3) Find the Eigenvalues (λi) and Eigenvectors (vi) of C
4) Choose the K largest eigenvalues that together account for p% of the total variance T = Σ λi
(because we want to reduce the number of dimensions)
For example, we might choose p = 0.95
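The four steps above can be sketched with NumPy (the data here is random and only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Assemble data into a matrix: number of samples x number of variables
X = rng.normal(size=(100, 5))

# 2) Compute the covariance matrix C (number of variables x number of variables)
C = np.cov(X, rowvar=False)

# 3) Find the eigenvalues and eigenvectors of C, sorted largest-first
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 4) Choose the K largest eigenvalues accounting for p% of the total variance
p = 0.95
explained = np.cumsum(eigvals) / eigvals.sum()
K = int(np.searchsorted(explained, p)) + 1
print("K =", K)
```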
What are the dimensions of the initial PCA matrix
number of samples x number of variables
row x column
What is an Eigenvalue (λi)
Represents the magnitude of variance along each direction
What is an Eigenvector (vi)
The directions of maximum variance in the dataset
How do you calculate the total variance of the data
T = Σ λi (the sum of all eigenvalues; the variance along a single direction vi is λi)
What are the dimensions of the PCA covariance matrix
number of var x number of var
(symmetrical)
what is in each segment of a covariance matrix, given 2 var:x,y
cov(x,x) cov(x,y)
cov(y,x) cov(y,y)
What is cov(x,x) the same as
var(x)
How do you calculate cov(x,y)
cov(x,y) = E[xy] - E[x]E[y]
E[] = average
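The covariance formula can be checked numerically (the x and y values are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])

# cov(x, y) = E[xy] - E[x]E[y], where E[] is the average
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_xy)  # 2.5

# cov(x, x) is the same as var(x)
cov_xx = np.mean(x * x) - np.mean(x) ** 2
print(np.isclose(cov_xx, np.var(x)))  # True
```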
What equation do eigenvalues and eigenvectors satisfy
Av = λv
To find the eigenvalues we solve
det( A - λI) = 0
where I is an identity matrix
What is the formula for the determinant of a 2x2 matrix [[a, b], [c, d]]
determinant = ad - bc
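Both the determinant formula and the eigenvalue equation Av = λv can be verified on a small example (the matrix A below is chosen for illustration; its eigenvalues are 1 and 3):

```python
import numpy as np

# A 2x2 symmetric matrix: A = [[a, b], [c, d]]
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# determinant = ad - bc
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
print(np.isclose(det, np.linalg.det(A)))  # True

# Eigenvalues solve det(A - lambda*I) = 0; for this A they are 1 and 3
eigvals, eigvecs = np.linalg.eigh(A)
print(eigvals)  # [1. 3.]

# Each pair satisfies A v = lambda v
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```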