Resource Management Flashcards
What is sensor fusion and how does it reduce uncertainty?
Sensor fusion combines data from multiple sensors to reduce uncertainty about the quantity being measured: because each sensor's noise is largely independent, combining their readings helps separate the noise from the actual signal.
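A minimal sketch of this idea, using two hypothetical distance sensors with made-up noise variances: combining their readings with inverse-variance weights gives an estimate whose variance is lower than either sensor's alone.

    # Minimal sketch: fuse two noisy readings of the same quantity.
    # The readings and variances below are illustrative values, not real sensor specs.

    def fuse(z1, var1, z2, var2):
        """Inverse-variance weighted average of two measurements."""
        w1 = 1.0 / var1
        w2 = 1.0 / var2
        fused = (w1 * z1 + w2 * z2) / (w1 + w2)
        fused_var = 1.0 / (w1 + w2)          # always <= min(var1, var2)
        return fused, fused_var

    # Example: a noisier ultrasonic sensor and a more precise LiDAR both measure distance (metres).
    estimate, variance = fuse(z1=2.30, var1=0.04, z2=2.21, var2=0.01)
    print(estimate, variance)   # the estimate is pulled toward the more trusted sensor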
Give an example of everyday sensor fusion and explain how it works
Google Maps combines GPS satellite data with phone sensors (gyroscope, compass, accelerometer): GPS supplies absolute position but can be noisy or drop out, while the inertial sensors track short-term motion and heading, so fusing them yields a precise, stable estimate of location and orientation.
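One common way this kind of phone-sensor fusion is done is a complementary filter; the sketch below is illustrative only (not Google's actual algorithm) and blends a fast but drifting gyroscope integration with noisy but drift-free compass readings to track heading.

    # Illustrative complementary filter for heading (degrees).
    # Gyro: smooth short-term rate, but drifts when integrated.
    # Compass: absolute heading, but noisy. Blending keeps the best of both.

    def update_heading(heading, gyro_rate_dps, compass_deg, dt, alpha=0.98):
        gyro_heading = heading + gyro_rate_dps * dt      # integrate angular rate
        return alpha * gyro_heading + (1 - alpha) * compass_deg

    heading = 90.0                                        # initial guess, degrees
    for gyro_rate, compass in [(1.5, 91.0), (1.4, 93.5), (1.6, 92.0)]:  # fake samples
        heading = update_heading(heading, gyro_rate, compass, dt=0.1)
    print(round(heading, 2))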
What are common types of sensors used in autonomous systems?
Common sensors include cameras (vision), ultrasonic sonar (echolocation), and LiDAR (which builds 3D images of the surroundings from laser pulses).
Describe high-level sensor fusion and its purpose
High-level sensor fusion combines processed sensor outputs (features, measurements, decisions) rather than raw data, enhancing decision-making.
How does sensor fusion contribute to accurate object positioning in autonomous systems?
Sensor fusion algorithms combine data from multiple sensors to estimate object positions accurately, weighting each sensor according to its strengths and weaknesses (accuracy, range, reliability under different conditions).
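A hedged sketch of what weighting by strengths and weaknesses can look like in practice: each sensor's 2D position estimate is weighted by the inverse of its noise variance. The variances here are made up; in a real stack they would come from sensor calibration.

    # Illustrative: fuse 2D object positions reported by a camera and a LiDAR.

    def fuse_positions(estimates):
        """estimates: list of ((x, y), variance) pairs; lower variance = more trusted."""
        wx = wy = wsum = 0.0
        for (x, y), var in estimates:
            w = 1.0 / var
            wx += w * x
            wy += w * y
            wsum += w
        return wx / wsum, wy / wsum

    camera = ((12.4, 3.1), 0.25)   # camera: good bearing, weaker depth -> larger variance
    lidar  = ((12.1, 3.0), 0.04)   # LiDAR: precise range -> smaller variance
    print(fuse_positions([camera, lidar]))  # result lands much closer to the LiDAR estimate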
What challenges arise when working with sensory data?
Sensors are susceptible to interference, resulting in noisy data. Sensor fusion aims to separate noise from the actual signal.
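A classic tool for separating noise from signal over time is the Kalman filter; this one-dimensional sketch (with made-up process and measurement noise values) smooths a stream of noisy readings toward the underlying signal.

    # Minimal 1D Kalman filter: track a roughly constant quantity from noisy readings.
    # q (process noise) and r (measurement noise) are illustrative tuning values.

    def kalman_1d(measurements, q=1e-3, r=0.1):
        x, p = measurements[0], 1.0          # initial state estimate and uncertainty
        estimates = []
        for z in measurements:
            p += q                           # predict: uncertainty grows slightly
            k = p / (p + r)                  # Kalman gain: trust in the new reading
            x += k * (z - x)                 # update: move toward the measurement
            p *= (1 - k)                     # uncertainty shrinks after the update
            estimates.append(x)
        return estimates

    noisy = [5.2, 4.8, 5.3, 4.9, 5.1, 5.4, 4.7]   # fake sensor stream around 5.0
    print([round(e, 2) for e in kalman_1d(noisy)])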
Explain the difference between low-level and high-level sensor fusion
Low-level fusion uses raw data, while high-level fusion combines processed sensor outputs (features, measurements, decisions).
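A toy contrast between the two levels (hypothetical detectors, not any specific system): low-level fusion averages raw measurements before deciding, while high-level fusion combines each sensor's already-made decision, for example by majority vote.

    # Toy contrast between fusion levels for an "obstacle ahead?" question.

    # Low-level: combine raw distance readings first, then decide once.
    raw_distances = [1.9, 2.1, 5.0]                 # metres, from three sensors
    fused_distance = sum(raw_distances) / len(raw_distances)
    low_level_decision = fused_distance < 3.0       # decide on the fused raw data

    # High-level: each sensor decides on its own, then the decisions are fused.
    decisions = [d < 3.0 for d in raw_distances]    # per-sensor obstacle flags
    high_level_decision = sum(decisions) > len(decisions) / 2   # majority vote

    print(low_level_decision, high_level_decision)  # the two levels can disagree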
How can sensor fusion benefit applications like autonomous driving and mobile robots?
As sensors become smaller and cheaper, more of them can be built into autonomous vehicles and mobile robots, and fusing their outputs improves perception, safety, and decision-making.
What role does sensor fusion play in managing resources in data-intensive systems?
The level of transformation at which fusion occurs shapes resource allocation: fusing raw, low-level data demands more bandwidth, storage, and compute, while fusing high-level, processed outputs moves less data but discards detail.
What is the significance of LiDAR sensors in autonomous driving?
LiDAR builds an accurate 3D image (point cloud) of the surroundings by emitting rapid laser pulses and measuring their reflections, which is crucial for autonomous vehicles.
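To make the "3D image" concrete: each returned pulse yields a range plus the beam's azimuth and elevation, which convert to an (x, y, z) point. The sample returns below are made-up values for illustration.

    import math

    # Illustrative: turn LiDAR returns (range, azimuth, elevation) into 3D points.
    # Angles in degrees, range in metres.

    def lidar_return_to_xyz(r, azimuth_deg, elevation_deg):
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        x = r * math.cos(el) * math.cos(az)
        y = r * math.cos(el) * math.sin(az)
        z = r * math.sin(el)
        return x, y, z

    returns = [(10.0, 0.0, 0.0), (10.0, 45.0, 5.0), (8.5, -30.0, 2.0)]
    point_cloud = [lidar_return_to_xyz(*ret) for ret in returns]
    for p in point_cloud:
        print(tuple(round(c, 2) for c in p))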