LiDAR
We have all heard of RADAR, an object-detection system that uses radio waves to determine the range, angle, or velocity of objects. The term RADAR was coined in 1940 by the United States Navy as an acronym for RAdio Detection And Ranging (or RAdio Direction And Ranging).
The electronic principle on which radar operates is very similar to the principle of sound-wave reflection. If you shout in the direction of a sound-reflecting object (like a rocky canyon or cave), you will hear an echo. If you know the speed of sound in air, the time the echo takes to return can be roughly converted into the distance and general direction of the object.
Similarly, we have SONAR (SOund Navigation And Ranging), a system for detecting objects under water by emitting sound pulses and detecting or measuring their return after being reflected.
Although sonar works fantastically for helping your grandfather catch the big ones, it has many other wonderful uses that we generally take for granted. For over one hundred years now, society has benefited from sonar technology in areas such as:
Defence and Military
Pipeline Condition Assessment
Search and Rescue
Underwater Communications
Underwater Mapping
Sonar is also used in power generation, at airports, on military bases and educational campuses, by municipalities and in construction, not to mention in private water utilities, manufacturing and chemical refining.
The thinking goes: if we can measure how far away things are, we can use that ability in everyday practice and safeguard ourselves from the human errors that lead to severe problems.
Another detection-and-ranging technology has been introduced to the technical world!
After radio waves and sound, it's time for light! The system is LiDAR – Light Detection And Ranging.
Just imagine waving a magic wand and, in a fraction of a second, knowing how far away everything around you is. That is how LiDAR works. Obviously without the wand!
LiDAR is fundamentally a distance technology. From an airplane or helicopter, a LiDAR system actively sends light energy to the ground. Each pulse hits the ground and returns to the sensor.
The system measures how long the emitted light takes to return to the sensor, and from that time it derives the varying distance to the Earth's surface.
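To make the ranging idea concrete, here is a minimal time-of-flight sketch in plain Python (not any vendor's API): the sensor times the round trip of the pulse and halves it, since the light travels out and back.

```python
# Minimal time-of-flight sketch, not a real sensor driver.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the target: the pulse travels out and back, so halve the trip."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A return arriving about 0.67 microseconds after emission
# corresponds to a target roughly 100 m away.
print(range_from_round_trip(0.67e-6))  # ~100.4 metres
```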
Actually, this is how LiDAR got its name – Light Detection and Ranging.
But let’s dissect LiDAR a little more. For example, what does a LiDAR system generate? What are LiDAR applications in GIS? Let’s demystify light detection and ranging. Hopefully after reading this, you will go from zero to a LiDAR hero.
LIDAR enables a self-driving car (or any robot) to observe the world with a few special super powers:
Continuous 360 degrees of visibility – Imagine if your human eyes allowed you to see in all directions all of the time
Insanely accurate depth information – Imagine if, instead of guessing, you could always know the precise distance (to an accuracy of ±2cm) of objects in relation to you
If you’ve seen a self-driving car before, you’ve probably seen a LIDAR sensor. It’s typically the bulky box mounted on the roof that spins continuously, as seen below on Uber and Baidu self-driving cars.
One of the most popular LIDAR sensors on the market is the high-powered Velodyne HDL-64E, as seen below mounted on Homer.
How Does LIDAR Work?
How does a sensor that has 360 degree vision and accurate depth information work? Simply put: a LIDAR sensor continually fires off beams of laser light, and then measures how long it takes for the light to return to the sensor.
By firing off millions of beams of light per second, the measurements from the LIDAR sensor enable a visualization of the world that is truly 3D. You can infer the precise distance to any object around you (out to roughly 60m, depending on the sensor).
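As a rough illustration (the beam angles and ranges below are made up, not real sensor output), here is how each beam's azimuth, elevation and measured range can be converted into a 3D point; a full spin of the sensor yields the 3D point cloud described above.

```python
import math

def to_cartesian(azimuth_deg: float, elevation_deg: float, range_m: float):
    """Convert one beam's spherical measurement into sensor-frame x, y, z."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# One full 360-degree sweep of a single laser channel at -15 degrees elevation,
# all returns assumed at 20 m, just to show the shape of a point cloud.
point_cloud = [to_cartesian(az, -15.0, 20.0) for az in range(0, 360)]
print(len(point_cloud), point_cloud[0])
```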
A Brief History of LIDAR
To understand why there's so much support behind LIDAR today, it's important to look at other technologies with similar goals.
Sonar
The original depth-sensing robot was the humble Bat (50 million years old!). A bat (or dolphin, among others) can do some of the same things as LIDAR using echolocation, otherwise known as Sonar (sound navigation and ranging). Instead of measuring light beams like LIDAR, Sonar measures distance using sound waves.
After 50 million years of biological exclusivity, World War 1 brought the first major deployment of man-made Sonar sensors, with the advent of submarine warfare. Sonar works excellently in water, where sound travels far better than light or radio waves (more on that in a second). Sonar sensors are in active use on cars today, primarily in the form of parking sensors. These short-range (~5m) sensors are a cheap way to know just how far that wall is behind your car. Sonar hasn't been proven to work at the kinds of ranges a self-driving car demands (60m+).
In this instance, the Bat is the sender/receiver
Radar
Radar (radio detection and ranging), much like Sonar, was another technology developed during an infamous World War (WW2, this time). Instead of light or sound waves, it uses radio waves to measure distance. We make use of a lot of Radar (using Delphi sensors) on Homer, and it's a tried-and-tested method that can accurately detect and track objects as far as 200m away.
Radar has very little in the way of downsides. It performs well in extreme weather conditions and is available at an affordable price point. Radar is heavily used not only for detecting objects, but for tracking them too (e.g. understanding how fast a car is going and in which direction). Radar doesn't give you the granularity of LIDAR, but Radar and LIDAR are very complementary, and it's definitely not either/or.
A radar installed on Homer
LIDAR
LIDAR was born in the 1960s, just after the advent of the laser. During the Apollo 15 mission in 1971, astronauts mapped the surface of the moon, giving the public the first glimpse of what LIDAR could do.
Before LIDAR was even considered for automotive and self-driving use, one of the popular use-cases of LIDAR was archeology. LIDAR provides a ton of value for mapping large-scale swaths of land, and both archeology and agriculture benefitted tremendously from it.
An aerially-captured LIDAR map
It wasn't until the 2000s that LIDAR was first used on cars, where it was made famous by Stanley (and later, Junior) in the 2005 DARPA Grand Challenge.
Traction of LIDAR in Self-Driving Cars
Why did LIDAR take off with self-driving cars? In a word: mapping. LIDAR allows you to generate huge 3D maps (its original application!), which you can then navigate the car or robot predictably within. By using a LIDAR to map and navigate an environment, you can know ahead of time the bounds of a lane, or that there is a stop sign or traffic light 500m ahead. This kind of predictability is exactly what a technology like self-driving cars requires, and has been a big reason for the progress over the last 5 years.
Object Detection
As LIDARs have become higher-resolution and operate at longer ranges, a new use-case has emerged in object detection and tracking. Not only can a LIDAR map let you know precisely where you are in the world and help you navigate it, it can also detect and track obstacles like cars, pedestrians and, according to Waymo, even football helmets.
Modern LIDAR enables you to differentiate between a person on a bike and a person walking, and even tell at what speed and in which direction they are moving.
A Google self-driving car
The combination of amazing navigation, predictability and high-resolution object tracking has meant that LIDAR is the key sensor in self-driving cars today, and it’s hard to see that domination changing. Unless…
Camera-Powered Cars
There are a number of startups out there approaching the problem of self-driving cars using purely cameras (and perhaps radar), with no LIDAR in sight. Tesla is the biggest company of the bunch, and Elon Musk has repeatedly pushed the idea that if humans can perceive and navigate the world using just eyes, ears and a brain, then why can't a car? I'm certain that this approach will achieve amazing results, especially as other talented teams, including Comma and AutoX, work toward this goal.
It's important to note that Tesla has an interesting constraint that may have factored into their decision: scale. Tesla hopes to ship 500k cars a year very soon and can't wait for LIDAR to come down in cost (or be manufactured in volume) tomorrow; it needed to happen yesterday!
Elon Musk has said that the LIDAR Google uses in its self-driving car "doesn't make sense in a car", a point he made at a press conference explaining Tesla's Autopilot features (9to5google.com).
The Future of LIDAR
The industry is marching ahead with a real focus on two things: decreasing cost and increasing resolution and range.
Cost Decrease
Solid-state LIDAR opens up the potential of powerful sub-$1k LIDAR units, which today can cost as much as $80k each. LeddarTech is one of the leaders in this early market.
Solid-state, fixed sensors are driven by the idea that you want an embeddable sensor with the smallest size at the lowest possible cost. Naturally, that also means a smaller field of view. Velodyne supports both fixed and surround-view sensors; the fixed sensors are miniaturized so they can be embedded. From a cost standpoint, both contain lenses, lasers and detectors. The lowest-cost system is actually the surround-view sensor, because rotation reuses the lens, lasers and detectors across the field of view, versus adding extra sensors, each with its own lenses, lasers and detectors. This reuse is both the most economical and the most powerful approach, as it reduces the error associated with merging different points of view in real time, something that really counts when the vehicle is moving at speed.
Resolution and Range Increase
The huge jump in the number of applications for LIDAR has brought with it a flood of talented founders and teams starting companies in the space. Higher-resolution output and increased tracking range (200m in some cases) will provide better object recognition and tracking, and they are key differentiators for sensors from startups like Luminar.
From ground to air, explore the types of LiDAR systems
1. Profiling LiDAR was the first type of Light Detection and Ranging, used in the 1980s for single-line features such as power lines. Profiling LiDAR sends out individual pulses along one line, measuring height along a single transect at a fixed nadir angle.
2. Small Footprint LiDAR is what we use today. Small-footprint LiDAR scans back and forth at a scan angle of about 20 degrees. If it goes beyond 20 degrees, the instrument may start seeing the sides of trees instead of looking straight down. (A quick sketch of how this scan angle and flying height set the ground swath width follows this list.)
Two types of LIDAR are topographic and bathymetric:
i. Topographic LIDAR maps the land typically using near-infrared light.
ii. Bathymetric LiDAR uses water-penetrating green light to measure seafloor and riverbed elevations.
3. Large Footprint LiDAR uses full waveforms and averages LiDAR returns over footprints of about 20m. It's very difficult to derive terrain from large-footprint LiDAR, because each pulse returns from a larger area that could be sloping. There are generally fewer applications for large-footprint LiDAR; the only such systems, SLICER (Scanning Lidar Imager of Canopies by Echo Recovery) and LVIS (Laser Vegetation Imaging Sensor), were both built by NASA and are experimental.
4. Ground-based LiDAR sits on a tripod and scans the hemisphere. Ground-based LiDAR is good for scanning buildings. It’s used in geology, forestry, heritage preservation and construction applications.
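As promised in the note on small-footprint scanning above, here is a back-of-the-envelope sketch (the flying height is an assumed example value, not from the text) of how the roughly 20-degree scan angle and the flying height determine the width of the ground swath covered on each pass.

```python
import math

def swath_width(flying_height_m: float, half_scan_angle_deg: float) -> float:
    """Ground swath width for a scanner sweeping +/- half_scan_angle off nadir."""
    return 2.0 * flying_height_m * math.tan(math.radians(half_scan_angle_deg))

# e.g. flying 1000 m above ground with a +/-20 degree sweep:
print(round(swath_width(1000.0, 20.0)))  # ~728 m wide strip per pass
```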
Here are LiDAR applications professionals use right now
Light detection and ranging is being used every day in surveying, forestry, urban planning and more. Here are a couple of LiDAR applications that stand out:
Riparian ecologists use LiDAR to delineate stream orders. With a LiDAR-derived DEM, tributaries become clear, and it's far easier to see where they go than with standard aerial photography.
Foresters use LiDAR to better understand forest structure and the shape of trees, because one light pulse can have multiple returns. In the case of trees, LiDAR systems can record information from the top of the canopy, through the canopy, all the way to the ground.
If Google's self-driving car got pulled over by the cops, how would it react? Self-driving cars use Light Detection and Ranging: the first secret behind Google's self-driving car is its LiDAR scanner, which detects pedestrians, cyclists, stop signs and other obstacles.
Archaeologists have used LiDAR to find subtle variations in ground elevation. It was a bit of a surprise when archaeologists found square patterns on the ground beneath vegetation; later, they found these square patterns were ancient buildings and pyramids built by the ancient Mayan and Egyptian civilizations.
LiDAR system components: breaking it down
How does a light detection and ranging system work? An airborne LiDAR has 4 parts, which work together to produce highly accurate, usable results (a simplified sketch of how their measurements combine follows the list):
LiDAR sensors scan the ground from side to side as the plane flies. The sensor commonly operates in green or near-infrared bands.
GPS receivers track the altitude and location of the airplane. These variables are important in attaining accurate terrain elevation values.
Inertial measurement units (IMUs) track the tilt of the airplane as it flies. Elevation calculations use this tilt to accurately determine the incident angle of the pulse.
Computers (Data Recorders) record all of the height information as the LiDAR scans the surface.
These LiDAR components cohesively make up a Light Detection and Ranging system
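Here is the simplified sketch referenced above of how these four components combine. It assumes a flat-earth, 2D geometry in which the IMU's roll and the scan angle merge into a single off-nadir angle; real processing applies a full 3D rotation and the GPS position, but the idea is the same: subtract the vertical component of the measured range from the aircraft's GPS altitude.

```python
import math

def ground_elevation(gps_altitude_m: float,
                     range_m: float,
                     scan_angle_deg: float,
                     imu_roll_deg: float) -> float:
    """Elevation of the point the pulse hit, in a simplified 2D geometry
    where roll and scan angle combine into one off-nadir angle."""
    off_nadir = math.radians(scan_angle_deg + imu_roll_deg)
    # Vertical component of the measured slant range:
    return gps_altitude_m - range_m * math.cos(off_nadir)

# Aircraft at 1200 m (GPS), slant range 1250 m, 15 deg scan angle, 2 deg roll:
print(round(ground_elevation(1200.0, 1250.0, 15.0, 2.0), 1))  # ground height in metres
```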
Storage of the return: full waveform vs discrete LiDAR
Light detection and ranging return pulses are stored in two ways:
Full waveform
Discrete LiDAR
What are the differences between full waveform and discrete LiDAR systems?
Imagine a LiDAR pulse in a forest hitting branches multiple times. Pulses come back as 1st, 2nd and 3rd returns, and then you get a large pulse from the bare-ground return.
When you record the data as separate returns, this is called discrete return LiDAR. A discrete system takes each peak and records it as a separate return.
Light Detection and Ranging is moving towards a full waveform system:
When you record the WHOLE RETURN as one continuous wave, this is called full-waveform LiDAR. Full waveform data is more complicated to work with, but you can count its peaks to convert it into discrete returns, as the sketch below illustrates.
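Here is the peak-counting sketch mentioned above, using a small synthetic waveform (the numbers are invented for illustration): each local maximum above a noise threshold becomes one discrete return.

```python
def discrete_returns(waveform, threshold):
    """Indices (time bins) of local peaks above the threshold = discrete returns."""
    peaks = []
    for i in range(1, len(waveform) - 1):
        if waveform[i] > threshold and waveform[i - 1] < waveform[i] >= waveform[i + 1]:
            peaks.append(i)
    return peaks

# Synthetic waveform: canopy top, a mid-canopy branch, then a strong ground return.
waveform = [0, 1, 6, 2, 1, 4, 1, 0, 1, 9, 3, 0]
print(discrete_returns(waveform, threshold=3))  # -> [2, 5, 9]
```

In this toy example, the three detected peaks correspond to a canopy-top return, a mid-canopy return and the strong bare-ground return described above.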