LiDAR vs. RADAR
The fundamental difference between LiDAR (Light Detection And Ranging) and RADAR (Radio Detection And Ranging) is the wavelength of the signals they use.
Although the basic purpose of LiDAR and RADAR is the same – detecting the presence and volume of distant objects – it is essential to understand the difference between these two technologies.
Light Detection and Ranging (LiDAR) is a light-based remote sensing technology. In the case of Yellowscan, the idea behind LiDAR is quite simple: point a small infrared laser beam at a surface and measure the time it takes for the pulse to return to its source. With a LiDAR that has a 360° viewing angle (using a rotating mirror, for example), it is possible to obtain a point cloud of the environment. Dedicated software then builds a 3D image that reproduces the shapes around the LiDAR, each with a precise position in space.
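The time-of-flight principle described above can be sketched in a few lines. This is an idealized illustration (a single clean return, no atmospheric effects), not Yellowscan's actual processing; the function names are our own.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to the surface: the pulse travels out and back, so divide by 2."""
    return C * round_trip_s / 2.0

def polar_to_xy(range_m: float, angle_deg: float) -> tuple[float, float]:
    """With a 360° rotating mirror, each (range, angle) pair becomes a 2D point."""
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

# A return measured 667 ns after emission corresponds to roughly 100 m:
print(round(range_from_time_of_flight(667e-9), 1))
```

Sweeping the angle while the platform moves, and stacking the resulting points, is what produces the point cloud.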
The RADAR system works in much the same way as LiDAR, with the big difference that it uses radio waves instead of laser or LED light. It transmits radio waves from a rotating or fixed antenna and measures the time of flight of the reflected signal.
The wavelength of RADAR ranges from 3 mm to 30 cm, while LiDAR operates at micrometer-scale wavelengths (Yellowscan LiDARs work at 903 and 905 nm).
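To put those wavelengths in perspective, the corresponding frequencies can be computed from f = c / λ. A quick sketch with the values quoted above:

```python
C = 299_792_458.0  # speed of light, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an electromagnetic wave from its wavelength."""
    return C / wavelength_m

lidar_905nm = frequency_hz(905e-9)  # ~3.3e14 Hz: near-infrared light
radar_3mm = frequency_hz(3e-3)      # ~1e11 Hz (100 GHz): millimeter-wave radar
radar_30cm = frequency_hz(0.30)     # ~1e9 Hz (1 GHz): UHF radio

print(f"LiDAR 905 nm: {lidar_905nm:.2e} Hz")
```

The LiDAR signal is several orders of magnitude higher in frequency, which is what allows the much tighter beams discussed next.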
So what difference does it make?
Thanks to its longer wavelength, RADAR can detect objects at long distances and through fog or clouds, but its lateral resolution is limited by the size of the antenna: a standard RADAR resolves only several meters at a distance of 100 meters.
LiDAR is a compact solution that enables a high level of accuracy for 3D mapping. At a distance of 100 meters, Yellowscan LiDAR systems have a resolution of a few centimeters.
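A back-of-the-envelope check of these figures: the cross-range resolution of a real-beam sensor is roughly range × wavelength / aperture (the diffraction limit). The aperture sizes below are illustrative assumptions, not Yellowscan specifications; in practice LiDAR resolution is set more by beam divergence and positioning accuracy than by diffraction.

```python
def lateral_resolution_m(range_m: float, wavelength_m: float, aperture_m: float) -> float:
    """Approximate diffraction-limited cross-range resolution: R * lambda / D."""
    return range_m * wavelength_m / aperture_m

# RADAR: 3 cm wavelength, 30 cm antenna, 100 m range -> about 10 m
print(lateral_resolution_m(100.0, 0.03, 0.30))

# LiDAR: 905 nm wavelength, assumed 5 cm optical aperture, 100 m range -> millimeters
print(lateral_resolution_m(100.0, 905e-9, 0.05))
```

The centimeter-scale antenna cannot focus a centimeter-scale radio wave into a narrow beam, which is why a radar return spans meters where a LiDAR spot spans millimeters.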
This is why LiDAR is used for laser altimetry and contour mapping. RADAR, on the other hand, is used for aircraft anti-collision systems, air traffic control, or radar astronomy.
Data from a scanning radar. The top image is a video frame of the scene, and the bottom one is the radar data, with corresponding locations marked.
The brightness indicates the strength of the return. Car A is close and in the center of the radar return (the video image does not extend as far to the right as the radar does); B is farther away and to the left; C is a bit farther still and barely visible above the roof of A; D is much farther and lies on a bearing between A and B.
Data from a Yellowscan UAS LiDAR. The image is a point cloud generated with a Yellowscan Vx-20 in Japan, from an 80 m AGL flight.
The top image is colored by height with a shading effect (Eye Dome Lighting). Objects as fine as distribution wires are clearly identifiable.
The bottom image is colored with RGB values from a camera synchronized to the LiDAR: each LiDAR point is assigned the color taken from the orthomosaic, yielding an immersive 3D rendering.
N.B.: Author: Léa Moussi. The content of this article is protected by copyright.