Lidar Flashcards
LiDAR stands for?
Also known as?
Light Detection And Ranging
- Airborne Laser Scanning (ALS)
- Laser Altimetry
What are the 3 main units of LiDAR System components?
- Laser and deflection unit (Scanning mechanism)
- Ranging Unit (Recorder)
- DGPS and INS (positioning)
What is LiDAR, basic how it works
- Active instrument, like radar
- Transmits laser pulses and receives returned laser signal
- Measures distance to target via travel time of signal
What is LiDAR, when developed
- Advent of GPS and INS platform positioning technology enabled accurate geo-location of return signals and the development of LiDAR beginning in the early 1990s
INS
- Inertial Navigation System
- Accelerometers and gyroscopes are used to track position and orientation of device
- Changes in velocity and orientation of remote sensing platform are detected
What data does lidar produce?
- Positional x,y
- Elevation z
- Intensity sometimes used
What is used as a lidar platform?
- Aerial platforms
- Spaceborne platforms: Lidar In Space Technology Experiment (LITE) in 1994 and Shuttle Laser Altimeter 1 and 2 (SLA-01 and SLA-02) missions in 1996 and 1997
LITE?
Lidar In Space Technology Experiment (1994)
- 1st highly detailed view of the vertical structure of cloud and aerosol from the surface through the middle atmosphere (a new application discovered, not just ground mapping)
LITE Goals
- Mostly to prove tech and use for shuttles
- Goals: Validate/explore key lidar tech for space borne applications, gain operational experience to develop future systems on satellite platforms
LITE Mission
- Mission: Operated for 53 hours, collecting 40 GB of data over 1.4 million km of ground track
- Instruments on top of shuttle, flipped shuttle upside down to point towards Earth surface to get data
Shuttle Laser Altimeter Earth Science Applications
- Oceanography, wave states
- Hazards, coastal erosion
- Geomorphology, drainage evolution
- Geodynamics, regional tilts
- Hydrology, lake levels
- Seismicity, fault scarps
- Volcanology, eruption volumes
- Ecology, tree height
- Climatology, cloud top heights
- Tectonics, mountain relief
- Glaciology, glacier dynamics
ICESat
- NASA mission launched in 2003 to understand atmospheric and climate change effects on polar ice masses
- The Ice, Cloud and land Elevation Satellite
- Measure ice sheet elevation and change over time, height profiles of clouds and aerosols, land elevations and veg cover, approx. sea ice thickness
- Ended in 2010
Applications of lidar
- Digital terrain modelling (DTM)
- faults and uplift
- forestry
- oceanography
- natural hazards, floods
- man made structure mapping
- oil and gas exploration
- natural resource management
- mapping of linear structures
- glacier (ice sheet) movement
- atmosphere
- often combined with other sources to improve estimation and classification
Laser light forms basis of lidar, pulses of light are?
- beamed towards target (Earth) several times per second
- Reflected light returns to sensor is measured
- Lasers focused, coherent beams of light energy w/ little divergence
Single returns are recorded when?
- Pulse strikes solid object like building or rock
Multiple returns are recorded when?
- Pulse strikes vegetation canopy, and some light travels past canopy top and returns come from leaves, stems, trunks and underlying ground
Pulse Repetition Frequency, prf
- Number of pulses per second emitted by a lidar
- At advent in the 1990s: 2,000-25,000 pulses per second
- 2000s: 50,000+ pulses per second, with datasets reaching terabytes
- Current: 250,000+ pulses per second
Lidar point cloud
- Data before processing like classification
- 3D point cloud of single and multiple returns
- Used to create DEM (DSM and DTM) or infer property of target
DEM
- Digital Elevation Model
- File or database containing elevation points over a contiguous area
- Subdivided into DSM and DTM
DSM
- Digital Surface Models
- Contain elevation info about all features in the landscape, such as vegetation, buildings, and other structures
DTM
- Digital Terrain Models
- Elevation info about bare-Earth surface w/o presence of veg or man-made structures
What are techniques for creating DEM’s?
- In situ surveying (costly and time consuming)
- Interferometric SAR (InSAR) (High res from space but veg and steep topo lead to error)
- Photogrammetry (accurate, timely, relatively affordable but inferred and less accurate than Lidar)
Discrete Lidar system
- records x,y,z and intensity
- z data from the ‘pulse ranging’ principle
- intensity from amplitude of returned signal
Pulse ranging principle
- Distance/range determined by the timing of pulses from and to the Lidar
- Range = speed of light x (time returned - time emitted)/2
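The pulse ranging formula above can be sketched directly; the travel time below is an illustrative value, not the output of any particular sensor.

```python
# Pulse ranging principle: range from the round-trip travel time of a pulse.
C = 299_792_458.0  # speed of light, m/s

def pulse_range(t_emitted_s, t_returned_s):
    """Range = speed of light * (time returned - time emitted) / 2."""
    return C * (t_returned_s - t_emitted_s) / 2.0

# A return delayed by ~6.67 microseconds corresponds to roughly 1 km range.
print(round(pulse_range(0.0, 6.671e-6)))  # 1000
```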
Early discrete lidar systems
- Only captured single returns
- Year 2000 captured 3-5 returns per pulse
What is the result of more returns per pulse?
- More returns increases size of dataset
- Most missions use 3 returns to balance detail with data size
Full waveform Lidar system
- Records entire waveform of return laser pulse as function of time
- Mainly research purposes (veg density, wildlife habitat mapping) b/c data volume is very high and processing is difficult
What are the 2 types of Lidar sensors? How are they the same/different?
- Profiling
- Imaging
- Measurement same for both types
- Differ in how swath is collected
Profiling Lidar
- Pulses aimed directly beneath platform at nadir
- Like single beam
- Echo profile along flight path of sensor
- Can get canopy height
How can canopy height be determined from Lidar?
- 1st return comes from the canopy top, last return from the underlying topography
- Subtract the last-return elevation from the 1st-return elevation to get canopy height
Imaging Lidar
- Scanner directs pulses over a swath beneath platform as it travels
- Extended echo profile beyond flight path of sensor
- Ground coverage has a ‘saw-tooth’ pattern
Why does imaging lidar produce a ‘saw-tooth’ pattern?
- Because the platform moves forward along the flight path as the swath sweeps from side to side
- Because of the oscillating (rotating) mirror
What are the different types of scanning mechanisms for imaging lidar and what swath patterns do they create?
- Oscillating mirror, z-shaped/sinusoidal (saw-tooth)
- Rotating polygon, parallel lines in diagonal
- Nutating mirror/Palmer scan, Elliptical
- Fiber switch, parallel lines (still experimental)
What is the most common scanning mechanism of imaging lidar?
- Oscillating mirror
- ‘Saw-tooth’ pattern
Practical limitations of space borne Lidar means that it is what type of sensor?
- Profiling type (not imaging)
- Returns from directly below platform
Lidar Footprint for Imaging Lidar
- Approx. circular on ground, instantaneous footprint
- Fp = [height of aircraft / cos²(scan angle under investigation)] × γ
- Where γ = divergence of the laser beam, in radians
Calculating Lidar footprint for Profiling Lidar?
- Diameter of illuminated area ≈ height above ground × γ
- Where γ is the divergence of the laser beam
- Simple b/c eliminate angular b/c looking straight down
What is the divergence of laser beam when calculating Lidar footprint?
- Divergence angle of the laser beam, in radians (the angular spread of the beam from the sensor)
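The two footprint formulas above can be sketched in Python; the flying height, divergence, and scan angle below are assumed illustrative values.

```python
import math

def profiling_footprint(height_m, divergence_rad):
    """Nadir footprint diameter ~= height above ground * beam divergence."""
    return height_m * divergence_rad

def imaging_footprint(height_m, divergence_rad, scan_angle_rad):
    """Off-nadir footprint ~= [height / cos^2(scan angle)] * divergence."""
    return (height_m / math.cos(scan_angle_rad) ** 2) * divergence_rad

h, gamma = 1000.0, 0.5e-3  # 1 km altitude, 0.5 mrad divergence (assumed)
print(profiling_footprint(h, gamma))                  # 0.5 m at nadir
print(imaging_footprint(h, gamma, math.radians(20)))  # larger off-nadir
```

Note the nadir result (0.5 m) falls inside the typical airborne footprint range of 0.2-0.9 m given below.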
What are the typical footprint sizes for airborne Lidar and space spaceborne?
- Airborne = 0.2-0.9m
- Spaceborne = 70m
What is the swath pattern of Lidar determined by?
- Lidar sensor type (profiling, imaging)
- Point Density
What kind of swath pattern does a profiling lidar produce?
- Pulses aimed directly beneath platform at Nadir
What kind of swath pattern does an imaging lidar produce? What happens when a platform increases in height above ground?
- Pulses aimed over a swath beneath platform
- Swath width is function of scan angle and flying height
- Swath = 2 × (height of sensor above ground) × tan(scan angle / 2)
- Fly higher increases swath but decreases resolution
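The swath formula and the height trade-off can be checked numerically; the altitude and scan angle here are illustrative assumptions.

```python
import math

def swath_width(height_m, scan_angle_deg):
    """Swath = 2 * height above ground * tan(full scan angle / 2)."""
    return 2.0 * height_m * math.tan(math.radians(scan_angle_deg) / 2.0)

# Doubling the flying height doubles the swath (at the cost of resolution).
print(round(swath_width(1000.0, 40.0)))  # 728
print(round(swath_width(2000.0, 40.0)))  # 1456
```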
What is the relationship of point density and Lidar swath pattern?
- Refers to spacing of hits along a profile (profiling lidar) or within a swath (imaging)
- Determined by flying height, platform velocity, field of view, prf
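Two back-of-envelope relationships implied by the card above can be sketched as follows; the velocities, prf, and swath values are illustrative assumptions, not specs of a particular system.

```python
def along_track_spacing(velocity_m_s, prf_hz):
    """Distance between successive shots along the flight line (profiling)."""
    return velocity_m_s / prf_hz

def point_density(prf_hz, swath_m, velocity_m_s):
    """Points per m^2 for an imaging lidar: pulses spread over swath * speed."""
    return prf_hz / (swath_m * velocity_m_s)

print(along_track_spacing(7000.0, 40.0))    # 175.0 m per shot (ICESat-like)
print(point_density(100_000, 700.0, 60.0))  # ~2.4 points per m^2
```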
What is in a Lidar dataset?
- Irregularly spaced hits in the x,y dimensions; processed data is smoothed
- 3D point cloud in the z dimension (locations; intensity may also be recorded)
What does data processing of Lidar require?
- Most software is proprietary or university product
- Dataset of x,y,z hits can be processed using GIS
- Processing requires user knowledge of target characteristics (can be augmented by optical data)
What are the 3 main data processing steps of Lidar?
- Filtering
- Classification
- Interpolation (and display)
Lidar processing: Filtering
- Removal of unwanted data
- May be only main processing step required
- Removal of intermediate (secondary) partial returns, keeping first and last returns
- Requires filtering algorithm
First returns used for?
- DSM (surface above ground, i.e. canopy etc.)
Last returns used for?
- DTM
What is the difference between the first and last returns?
- Height distribution model (canopy height etc)
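The first-minus-last-return idea above amounts to subtracting the DTM from the DSM cell by cell; a minimal sketch on toy 2×2 grids (assumed already interpolated to rasters):

```python
# Canopy height model (CHM) = first-return surface (DSM) - last-return
# bare earth (DTM), computed per grid cell.
dsm = [[25.0, 27.0], [24.0, 30.0]]  # canopy-top elevations (m), illustrative
dtm = [[10.0, 11.0], [10.5, 12.0]]  # bare-earth elevations (m), illustrative

chm = [[s - t for s, t in zip(srow, trow)] for srow, trow in zip(dsm, dtm)]
print(chm)  # [[15.0, 16.0], [13.5, 18.0]]
```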
Lidar filtering algorithms, deriving CHM
- Deriving canopy height model (CHM) often of importance for forest resource management as it is indicator of other variables such as timber volume, biomass, carbon sequestration
- Discrete return of Lidar data of forest can be filtered to derive CHM
- Vegetation removal filter
Lidar Processing: Classification goals
- Find a specific structure from data after filtering
- Assigning proper class labels to filtered data
- Achieved by applying rules or statistical models that separate classes
Classification using un-filtered data
- Possible when point cloud data contains info that aids classification process
- Ex. differentiating buildings from canopy of similar height when both are first returns to Lidar, veg will also have partial returns and multiples
Interpolation
- Creates smooth, continuous dataset from discrete objects like Lidar points
- Irregularly spaced hits re-projected to form a regular image-like array
- Different interpolation methods exist, choose one that represents the surface (DEM)
Interpolation methods differ in terms of?
- Ease of use
- Mathematical complexity
- Computational expense
What are the common methods of interpolation algorithms?
- Interpolation rasterizes based on 2 common methods:
- Inverse Distance Weighting (IDW)
- Spline
- Others based on natural neighbours
IDW
- Inverse Distance Weighted
- Weights assigned to known values w/in neighbourhood according to distance away
- Weight typically based on 1/distance² (i.e. inverse distance squared)
- The power (exponent) is a user-set option
- New value = Σ(weight × point value) / Σ(weights) within the neighbourhood search radius
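The weighted-sum rule above can be written as a short function; the points, power, and search radius here are illustrative assumptions.

```python
import math

def idw(x, y, points, power=2.0, radius=50.0):
    """IDW estimate at (x, y) from points [(px, py, pz), ...], or None."""
    num = den = 0.0
    for px, py, pz in points:
        d = math.hypot(x - px, y - py)
        if d > radius:
            continue       # outside the neighbourhood search radius
        if d == 0.0:
            return pz      # exact hit: keep the known value
        w = 1.0 / d ** power
        num += w * pz
        den += w
    return num / den if den > 0.0 else None

pts = [(0.0, 0.0, 10.0), (10.0, 0.0, 20.0)]
print(idw(5.0, 0.0, pts))  # equidistant from both points -> 15.0
```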
What are the benefits and drawbacks of IDW?
- Benefit: useful for locationally dependent variables
- Drawbacks: trends not well accounted for, ‘duck-egg’ pattern
Why are trends not accounted for in IDW?
- Weighted means: spaces between points trend towards the mean
- Causes pits where there should be peaks and vice versa
What is the ‘duck-egg’ pattern in IDW?
- Solitary points have values that differ greatly from their surroundings
Spline
- Interpolation method
- Smooth curves locally fitted to a set of data points using piece-wise polynomial functions
- Piece-wise uses a few points at a time
- Smoothing around corners, e.g. peaks and valleys in vertical dimension
What are the benefits and drawbacks of Spline?
- Benefit: quick and efficient DEM creation, smooth topography and aesthetic appearance, small scale features retained
- Drawback: uncharacteristically smooth surface
Interpolated data display
- Elevation raster
- Slope (rate of elevation change in a neighbourhood is assigned a colour)
- Aspect (downslope direction assigned a colour)
- Hillshade (shaded relief computed by assuming an illumination source with a given orientation)
Applications of Lidar are dependent on what? Examples?
- Wavelength
- Atmospheric: clouds and aerosols at 532nm
- Bathymetry: water body penetration to approx. 50m at 532nm
- Vegetation and surface: 1064nm
Differential Measurement Lidar? Example?
- Application where Lidar measured at 2 different wavelengths btwn surfaces
- Bathymetry: 1064nm reflects from the water surface, 532nm reflects from the ocean floor; time difference = bottom profile
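Converting the surface-to-bottom time difference into a depth can be sketched as below; note the water refractive index (n ≈ 1.33, so light travels at c/n in water) is an assumption not stated in the card.

```python
# Depth from the time difference between the 1064 nm surface return and the
# 532 nm bottom return of a bathymetric lidar.
C = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33      # refractive index of water (assumed)

def water_depth(delta_t_s):
    """Depth = (c / n_water) * dt / 2 (round trip through the water column)."""
    return (C / N_WATER) * delta_t_s / 2.0

# A delay of ~89 ns corresponds to roughly 10 m of water.
print(round(water_depth(8.87e-8), 2))
```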
Differential Absorption Lidar (DIAL)
- Ratio of intensity of return signals at 2 wavelengths, as a function of range, used to infer atm properties
- Ex. aerosol concentration, 532nm partially absorbed by aerosols, 1064nm not absorbed
Lidar and surface elevation change applications
- Thinning of ice sheets
- NASA’s Airborne Topographic Mapper (ATM)
- ICESat
ATM, wavelength, footprint
- NASA’s Airborne Topographic Mapper
- 5000Hz Swath Lidar at 532nm
- 1m footprint
- Measures small-scale topographic changes to approx. 10cm accuracy
- Flown over Greenland btwn 1997-2003
ICESat, sensor type, wavelength, spacing, goals
- Ice, Cloud and Land Elevation Satellite
- GLAS instrument (Geoscience Laser Altimeter System)
- Wavelengths 1064nm and 532nm
- 70m spacing
- Goals: surface elevation data, Cloud properties
Calipso Mission, wavelength, goal
- Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation
- 1064nm and 532nm
- Goals: High resolution vertical profiles of aerosols and clouds
What is necessary to improve Lidar classification?
- Fusion with multi and hyper spectral optical data
- Merged more closely w optical to improve interpretation
- Focus of much research