Recognition Flashcards
Explain face recognition with eigenfaces
Initially, the images are subsampled and flattened into a single vector each.
Each image is zero-centred by subtracting the mean face.
Hence, in a single big matrix, we collect the differences from the mean vector; we call it T.
Compute the covariance matrix of T.
Compute its eigenvalues and eigenvectors; the eigenvalue of each eigenvector measures how much variance that direction explains.
The eigenvectors are ordered by eigenvalue, meaning the eigenvectors with the smallest eigenvalues are the ones that do not explain much of the variance and can be dropped.
Finally, we encode a face by projecting the mean-subtracted image onto the retained eigenvectors (the eigenfaces),
and we decode it by multiplying the coefficients by the transposed eigenvector matrix and adding the mean back.
Finally, through this compression we can perform recognition by comparing the coefficient vectors.
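The pipeline above can be sketched in a few lines of NumPy; this is a minimal illustration (function names are my own), using SVD of the centred data as a standard way to get the covariance eigenvectors:

```python
import numpy as np

def eigenfaces_fit(images, k):
    """images: one flattened face per row. Returns mean face and top-k eigenfaces."""
    mu = images.mean(axis=0)
    T = images - mu                              # zero-centre: differences from the mean face
    # right singular vectors of T = eigenvectors of the covariance matrix,
    # already ordered by explained variance (singular values)
    _, _, vt = np.linalg.svd(T, full_matrices=False)
    return mu, vt[:k]

def encode(face, mu, eigenfaces):
    return eigenfaces @ (face - mu)              # project centred face onto eigenfaces

def decode(coeffs, mu, eigenfaces):
    return eigenfaces.T @ coeffs + mu            # reconstruct from the coefficients
```

Recognition then amounts to comparing `encode(...)` coefficient vectors, e.g. by Euclidean distance.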
How do we usually recognize an object?
For a given image, we usually try to segment the image, extract some sort of features, and then classify based on those features.
What is pattern matching?
For a given template window, we try to match it all over the image by defining a matching criterion.
For instance, we may use convolution and make use of the (normalised) cross-correlation.
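A minimal sketch of template matching with normalised cross-correlation (a plain double loop for clarity; real implementations use FFT-based convolution):

```python
import numpy as np

def match_template(image, template):
    """Score every valid position of `template` in `image` with normalised cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    H, W = image.shape
    scores = np.zeros((H - th + 1, W - tw + 1))
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.linalg.norm(wc) * np.linalg.norm(t)
            scores[y, x] = (wc * t).sum() / denom if denom > 0 else 0.0
    return scores
```

The position with the highest score is the best match; an exact occurrence of the template scores 1.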
What is the limitation of pattern matching?
It is not invariant: a change in scale, rotation, or illumination between the template and the image breaks the match, and sliding the template over every position is computationally expensive.
Explain in a very general way the eigenfaces approach
We have a database and we want to recognize faces.
The issue here is that the feature space is big; hence, using PCA, we want to extract the directions that best describe our data.
We make an approximation: we consider a feature space composed of the grey level of every pixel in a face.
One face in the database is simply one point in that feature space; for a given new image, we then simply compute its distance to the database points.
Hence here we use PCA to compress:
we keep only the leading eigenvectors (the eigenfaces) and compare faces in this reduced space.
What are the downsides and advantages of eigenfaces?
Advantages: it is simple and fast, and the compression makes storage and comparison cheap.
Downsides: it is sensitive to illumination, pose, and alignment, since any such change moves the point in pixel space.
Explain nearest neighbours
For a given feature space, we try to classify one unknown case placed in that feature space; we determine the class of that case by a majority vote among its k nearest neighbours.
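The majority vote described above can be sketched as follows (a minimal k-NN classifier, names my own):

```python
import numpy as np
from collections import Counter

def knn_classify(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_X - query, axis=1)   # Euclidean distance to every sample
    nearest = np.argsort(dists)[:k]                   # indices of the k closest points
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]                 # majority class
```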
What might be the issue in nearest neighbours?
The distance measurement might be an issue: the result depends on the metric chosen and on how the features are scaled.
Explain the decision tree
It is an iterative way of splitting the feature space, feature by feature.
At each step, the space is split on one feature, and we pick the split that optimises the drop in entropy (the information gain) between before and after the split.
For instance, if we take the whole space, the entropy will be high, whereas a portion where one class dominates has low entropy.
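The split criterion above can be made concrete: entropy of the labels before the split, minus the weighted entropy of the two children (a minimal sketch, assuming binary splits):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(labels, left_mask):
    """Entropy reduction when splitting `labels` by a boolean mask."""
    left, right = labels[left_mask], labels[~left_mask]
    w = len(left) / len(labels)
    return entropy(labels) - (w * entropy(left) + (1 - w) * entropy(right))
```

A split that separates the classes perfectly has maximal gain; a split that leaves both sides as mixed as the parent has gain 0.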
What is the disadvantage of the decision tree?
Although it is very fast, we can rapidly end up with a very large tree; hence we may want to keep the split information, and we may also add pruning to limit the depth or the number of levels of the tree.
It is also very sensitive to noise, which makes it unstable.
What is generalisation?
The ability of the model to perform well on new, unseen data rather than only on the training set.
Explain bag of visual words
It is an extension of bag of words:
In an image, around corners or other points of interest, we extract some sort of window (a patch).
We can then use clustering to group patches that are similar.
Then we build some kind of vocabulary by picking the centres of the clusters.
Then, for a given image, we extract its patches and compare them to the dictionary, to later extract the word frequencies, which constitute the signature of the image.
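The two stages above (build vocabulary, then compute a signature) can be sketched with a tiny hand-rolled k-means; patch extraction is assumed done, so patches arrive as flat descriptor vectors (names my own):

```python
import numpy as np

def build_vocabulary(patches, k, iters=20, seed=0):
    """Cluster patch descriptors with a small k-means; cluster centres = visual words."""
    rng = np.random.default_rng(seed)
    centres = patches[rng.choice(len(patches), k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each patch to its nearest centre, then move centres to the mean
        d = np.linalg.norm(patches[:, None] - centres[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = patches[labels == j].mean(axis=0)
    return centres

def image_signature(patches, vocabulary):
    """Frequency histogram of nearest visual words: the image's BoVW signature."""
    d = np.linalg.norm(patches[:, None] - vocabulary[None], axis=2)
    words = d.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()
```

Two images can then be compared through the distance between their signature histograms.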
Explain the edge histogram
We use some sort of local descriptor, where we compute the histogram of the directions of the gradient, typically weighted by the gradient magnitude.