Lecture 1 Flashcards
How are images represented in numpy?
3D array: y, x, c
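A minimal sketch of this layout in numpy (the 4x6 size is just for illustration):

```python
import numpy as np

# A 4x6 RGB image: axis 0 is y (rows), axis 1 is x (columns), axis 2 is c (channels)
img = np.zeros((4, 6, 3), dtype=np.uint8)

# Index order is (y, x, c): set the pixel at row 0, column 5 to full intensity
# in channel 2
img[0, 5, 2] = 255

print(img.shape)  # (4, 6, 3)
```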
What is the advantage and disadvantage of using floats in image representations?
Advantage: many calculations (colour conversion etc.) yield fractional values, so floats lose less accuracy across chained operations
Disadvantage: integer operations are faster
What colour space is used in newspaper printing?
CMYK
What colour space is used in NTSC analogue TV?
YIQ
What is YIQ?
Y channel is greyscale
I and Q are two colour-difference channels; together with Y they let the receiver reconstruct RGB from the signal
What is the difference between a calibrated and uncalibrated representation?
Uncalibrated (e.g. RGB, HSV) just use data directly from the camera
Calibrated requires the entire capture system to be properly calibrated (difficult)
How do you tell if an image is under-, over- or well-exposed from its histogram?
Under: massive peak of pixels towards left
Well: nice distribution of pixels
Over: peak towards right
How does opencv represent monochrome images?
Just 2d array, no colour channels
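A minimal numpy sketch of this (cv2.imread with the cv2.IMREAD_GRAYSCALE flag returns exactly this kind of array):

```python
import numpy as np

# A monochrome image OpenCV-style: a plain 2-D array of shape (rows, cols),
# with no third channel axis
gray = np.full((4, 6), 128, dtype=np.uint8)

print(gray.ndim)   # 2
print(gray.shape)  # (4, 6)
```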
How does contrast stretching work?
Takes the min and max values of the histogram and scales the entire range to have a min of 0 and max of 255
What is the formula for contrast stretching?
P'(x,y) = (Vmax - Vmin) * (P(x,y) - Pmin) / (Pmax - Pmin) + Vmin
where Pmin and Pmax are the smallest and largest pixel values in the image, and [Vmin, Vmax] is the target range (e.g. 0 to 255)
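The formula can be sketched in numpy as follows (a minimal version, assuming an 8-bit target range of 0-255):

```python
import numpy as np

def contrast_stretch(img, v_min=0, v_max=255):
    """Linearly map [img.min(), img.max()] onto [v_min, v_max]."""
    p = img.astype(np.float64)
    p_min, p_max = p.min(), p.max()
    if p_max == p_min:          # flat image: nothing to stretch
        return img.copy()
    out = (v_max - v_min) * (p - p_min) / (p_max - p_min) + v_min
    return out.astype(np.uint8)

img = np.array([[50, 100], [150, 200]], dtype=np.uint8)
print(contrast_stretch(img))
# [[  0  85]
#  [170 255]]
```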
What is the problem with contrast stretching?
How can this be addressed?
A single outlier pixel (e.g. a glint off metal) can set Pmin or Pmax, so the stretch does almost nothing for the rest of the image and can actually make things worse
Work in 5% from either end of the histogram (i.e. use the 5th and 95th percentile values as the limits instead of the absolute min and max)
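The percentile fix can be sketched by clipping before stretching; the 5%/95% cut-offs follow the answer above, and np.percentile does the limit-finding:

```python
import numpy as np

def robust_stretch(img, low_pct=5, high_pct=95):
    """Contrast stretch using percentile limits so outliers can't dominate."""
    p = img.astype(np.float64)
    p_min, p_max = np.percentile(p, [low_pct, high_pct])
    if p_max == p_min:
        return img.copy()
    p = np.clip(p, p_min, p_max)   # outliers saturate instead of setting the limits
    out = 255 * (p - p_min) / (p_max - p_min)
    return out.astype(np.uint8)

# The single bright outlier (255) no longer compresses the rest of the range
img = np.array([[10, 20], [30, 255]], dtype=np.uint8)
print(robust_stretch(img))
```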
What is histogram equalisation?
Non-linear mapping of grey levels to improve low-contrast regions of an image
How does histogram equalisation work? (basic)
Stretches out regions of similar grey value and compresses regions where few pixels have distinct values
What are the steps in histogram equalisation?
From the histogram of the image, build a cumulative histogram (each element h[g] holds the number of pixels with a value of g or lower)
Normalise the cumulative histogram to the output range (e.g. 0-255), then scan over the image and use it as a look-up table.
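The steps above can be sketched in numpy (a minimal version assuming an 8-bit greyscale image; the cumulative histogram is scaled to 0-255 before being used as the look-up table):

```python
import numpy as np

def equalise(img):
    # Step 1: histogram, then cumulative histogram: cum[g] = #pixels with value <= g
    hist = np.bincount(img.ravel(), minlength=256)
    cum = np.cumsum(hist)
    # Normalise the cumulative histogram to the 0-255 output range
    lut = np.round(255 * cum / cum[-1]).astype(np.uint8)
    # Step 2: scan over the image, using the cumulative histogram as a look-up table
    return lut[img]

# Three nearly identical grey levels get spread across the full range
img = np.array([[100, 100], [101, 102]], dtype=np.uint8)
print(equalise(img))
```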
Why is histogram equalisation avoided in serious vision work?
The non-linear mapping distorts the pixel distribution (the shape of the histogram), which makes further quantitative processing near impossible