Lecture 9 - Complex Sensors Flashcards
Complex sensors
- What do they provide and require?
- Examples
- Provide much more information & require sophisticated processing
- ultrasound, laser, vision, radar
Sonar
- Subtype of..
- What does it use?
- What kind of sensor?
- Subtype of ultrasound sensors
- Localizes via echolocation (like bats)
- Active sensor (transmits sound, waits for it to bounce back)
Exact sonar method (3)
-> distance up to..
- Measure the time between emission and detection of the sound
- Multiply by the speed of sound
- Divide by 2
-> distances up to 10 meters & 30 degrees sound cone
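The three steps above can be sketched as a small function (pure Python; the speed-of-sound constant assumes air at roughly 20 °C):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def sonar_distance(echo_time_s):
    """Distance from the time between emission and echo detection."""
    # The sound travels to the object and back, so halve the round trip.
    return echo_time_s * SPEED_OF_SOUND / 2

# A 20 ms round trip corresponds to about 3.43 m.
print(round(sonar_distance(0.02), 2))  # → 3.43
```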
Sonar
Problem
- Specular reflection
Sound waves from the emitter bounce off the object's surface before returning to the detector
Sonar
When is specular reflection likely?
What is a solution to it?
Smooth surfaces, small incidence angles, reflective surfaces
Solution: Multiple readings, active sensing
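Fusing multiple readings with a robust statistic (the median is one common choice; this sketch and its values are illustrative) rejects the occasional specular outlier:

```python
from statistics import median

def fused_range(readings):
    """Fuse several sonar readings; the median discards the occasional
    outlier caused by specular reflection (an overly long round trip)."""
    return median(readings)

# Four honest readings near 2 m plus one specular outlier:
print(fused_range([2.01, 1.98, 9.75, 2.02, 2.00]))  # → 2.01
```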
Laser
- How does laser range sensor work?
- Comparison to sonar Resolution
- and Frequency
- Same principle as sonar, but faster (phase-shift measurement)
- Light is emitted as a beam, not a cone -> higher resolution
- Light travels faster -> higher measurement frequency
Vision
- Motion vision
- Stereo/binocular vision
1. Difference between sequential frames (many tasks)
2. Find matches between stereo images (to see depth)
Vision
What is needed for tracking a ball (2)?
1. Unique color detection
2. Movement detection
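The movement-detection cue can be sketched as simple frame differencing on tiny brightness grids (all values here are made up for illustration):

```python
def moving_pixels(prev_frame, curr_frame, threshold=10):
    """Movement detection: pixels whose brightness changed
    by more than `threshold` between sequential frames."""
    moved = []
    for y, (row_prev, row_curr) in enumerate(zip(prev_frame, curr_frame)):
        for x, (a, b) in enumerate(zip(row_prev, row_curr)):
            if abs(a - b) > threshold:
                moved.append((x, y))
    return moved

# A bright blob moves one pixel to the right between frames:
prev = [[0, 0, 0],
        [0, 200, 0],
        [0, 0, 0]]
curr = [[0, 0, 0],
        [0, 0, 200],
        [0, 0, 0]]
print(moving_pixels(prev, curr))  # → [(1, 1), (2, 1)]
```

In a real ball tracker this would be combined with the color cue: only moved pixels with the ball's unique color are kept.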
Vision
- Sensor fusion
- How to simplify processing?
1. Use vision in combination with other sensors
2. Use knowledge about the environment
Vision
Edge detection
1. How are edges detected?
2. Which mathematical method is used? (general and precise)
3. What is its purpose?
1. An edge (image feature) is where pixels change their brightness sharply
2. The second derivative; precisely, the Laplacian L (applied by convolution)
3. Segmentation of the image
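A minimal sketch of convolving with one standard 3x3 discrete Laplacian kernel (the image values are invented):

```python
# A common 3x3 discrete Laplacian kernel (second-derivative approximation).
LAPLACIAN = [[0,  1, 0],
             [1, -4, 1],
             [0,  1, 0]]

def laplacian_at(img, x, y):
    """Convolve the kernel with the 3x3 neighbourhood of (x, y).
    A large magnitude means a sharp brightness change, i.e. an edge."""
    return sum(LAPLACIAN[j][i] * img[y - 1 + j][x - 1 + i]
               for j in range(3) for i in range(3))

img = [[10, 10, 10],
       [10, 10, 10],
       [10, 10, 10],
       [90, 90, 90],
       [90, 90, 90],
       [90, 90, 90]]
print(laplacian_at(img, 1, 1), laplacian_at(img, 1, 2), laplacian_at(img, 1, 3))
# → 0 80 -80  (flat region: zero; sign change across the step marks the edge)
```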
Vision
Edge detection
What is a problem?
Solution?
1. Noise
2. Gaussian smoothing (smoothing the image)
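Smoothing first suppresses noise that the second derivative would otherwise amplify; a 1-D sketch using the simple 3-tap Gaussian approximation [1, 2, 1] / 4 (signal values are made up):

```python
def gaussian_smooth_1d(signal):
    """Smooth the interior samples with the 3-tap kernel [1, 2, 1] / 4,
    a common discrete approximation of a small Gaussian."""
    return [(signal[i - 1] + 2 * signal[i] + signal[i + 1]) / 4
            for i in range(1, len(signal) - 1)]

noisy = [10, 10, 14, 10, 10, 90, 90, 90]  # a noise spike at index 2
print(gaussian_smooth_1d(noisy))
# → [11.0, 12.0, 11.0, 30.0, 70.0, 90.0]
```

The spike is damped (14 -> 12) while the real brightness step survives, so the edge detector sees far fewer false edges.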
Image feature
SIFT
1. What does it find?
2. Which method is used?
-> what should not be where?
1. Image points with high contrast
2. Difference of Gaussians (DoG) filter
-> Extrema in the DoG image should not lie on an edge (line)
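The DoG filter subtracts a heavily blurred copy of the image from a lightly blurred one; a 1-D sketch built on repeated passes of a 3-tap [1, 2, 1] / 4 blur (the pass counts and values are illustrative, not SIFT's actual scale-space parameters):

```python
def smooth(signal):
    """One pass of the 3-tap kernel [1, 2, 1] / 4 (endpoints kept as-is)."""
    out = list(signal)
    for i in range(1, len(signal) - 1):
        out[i] = (signal[i - 1] + 2 * signal[i] + signal[i + 1]) / 4
    return out

def difference_of_gaussians(signal, passes_small=1, passes_large=3):
    """DoG: blur lightly, blur heavily, subtract.
    Extrema of the result mark high-contrast points."""
    a, b = list(signal), list(signal)
    for _ in range(passes_small):
        a = smooth(a)
    for _ in range(passes_large):
        b = smooth(b)
    return [x - y for x, y in zip(a, b)]

print(difference_of_gaussians([10, 10, 10, 90, 90, 90]))
# → [0, -8.75, -7.5, 7.5, 8.75, 0]  (the extrema bracket the contrast step)
```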
Image feature
SIFT
1. Descriptor (what to do with extrema?)
-> save what where?
- Look at the pixels around each extremum in the DoG image and compute their gradients (this describes the keypoint)
-> save the gradient histograms in a database
Image feature
SIFT Matching
1. What to compare?
2. When does it recognize a location/object?
1. Compare the new keypoint descriptor with the descriptors in the database (compute the distance; if small -> recognition)
2. When enough keypoints associated with the location/object are recognized
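Matching can be sketched as a nearest-neighbour lookup over descriptor vectors with a distance threshold (the database entries, labels, and threshold here are all invented; real SIFT descriptors are 128-dimensional histograms):

```python
import math

def match_keypoint(descriptor, database, max_distance=0.5):
    """Compare a new keypoint descriptor with the stored ones; a small
    Euclidean distance to a database entry counts as a recognition."""
    best_label, best_dist = None, float("inf")
    for label, stored in database.items():
        d = math.dist(descriptor, stored)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= max_distance else None

# Hypothetical 4-bin gradient histograms for two known keypoints:
database = {"doorway": [0.9, 0.1, 0.0, 0.0],
            "poster":  [0.2, 0.2, 0.3, 0.3]}
print(match_keypoint([0.85, 0.15, 0.0, 0.0], database))  # → doorway
print(match_keypoint([0.0, 0.0, 0.0, 1.0], database))    # → None
```

A location or object would then be recognized once enough of its keypoints match this way.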
Motion Detection
- Motion field
- Optical flow
1. Motion of the actual object
2. Motion as it optically appears (could just be a reflection)