good job
This is so cool! I can't wait to read the paper :) Thanks for sharing!
Could you please share the 3D mounts you made for the VLP-16 and Mid-360? I'd like to replicate this!
What is the red part at the bottom of the lidar?
Very cool demo, thanks for sharing
Pretty cool stuff!
great work
Has the paper been released?
Nice work! Can you share some test dataset?
Cool. How does it know where the pavement is and where the grass (or other objects) is, to color them differently?
The color map appears to be the intensity channel. This is inherent to LiDAR sensors: it measures how much of the emitted laser energy is returned from each measurement. Foliage has reflectivity and scattering properties distinct from pavement, so the two show up with different intensities.
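The intensity-based coloring described above can be sketched as a simple normalization step. This is a minimal illustration, not the authors' pipeline; the function name, the percentile clipping, and the toy data are my own assumptions.

```python
import numpy as np

def intensity_to_gray(points_xyzi):
    """Map per-point LiDAR intensity to a grayscale color in [0, 1].

    points_xyzi: (N, 4) array of x, y, z, intensity.
    Returns (N, 3) RGB colors. Weak returns (e.g. dark pavement) come out
    dark; strong returns (e.g. retroreflectors) come out bright.
    """
    intensity = points_xyzi[:, 3].astype(float)
    # Clip outliers so a few retroreflective hits don't wash out the map.
    lo, hi = np.percentile(intensity, [2, 98])
    norm = np.clip((intensity - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
    return np.repeat(norm[:, None], 3, axis=1)  # replicate to gray RGB

# Toy cloud: pavement-like (low intensity) vs foliage-like (higher intensity).
cloud = np.array([
    [1.0, 0.0, 0.0, 5.0],
    [2.0, 0.0, 0.0, 40.0],
    [3.0, 0.0, 0.0, 200.0],
])
colors = intensity_to_gray(cloud)
```

Real viewers usually apply a colormap (e.g. viridis) to `norm` instead of grayscale, but the normalization step is the same.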
cool
wow!! impressive🤩 Does this system require motion compensation for rotation? Is it applied?
Thanks! Yes, we installed an IMU that rotates with the LiDAR. Our state estimation algorithm fuses the IMU measurements to compensate for both the rotation and the carrier's motion.
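The compensation idea in the reply above can be illustrated with a toy deskew step: rotate each point back to a common reference time using a rate measured by the IMU. This is a hypothetical sketch under a constant-yaw-rate assumption, not the authors' state estimator; a real pipeline integrates the full IMU trajectory and removes translation as well.

```python
import numpy as np

def deskew_constant_rate(points, timestamps, omega_z, t0):
    """Undistort a rotating-LiDAR sweep assuming a constant yaw rate.

    points:     (N, 3) xyz, each in the sensor frame at its capture time
    timestamps: (N,)   per-point capture times (seconds)
    omega_z:    yaw rate (rad/s), e.g. from the co-rotating IMU gyro
    t0:         reference time all points are deskewed to
    Returns the points expressed in the sensor frame at time t0.
    """
    out = np.empty_like(points, dtype=float)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        a = -omega_z * (t - t0)  # undo rotation accumulated since t0
        c, s = np.cos(a), np.sin(a)
        R = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
        out[i] = R @ p
    return out
```

For example, a point captured one second into a sweep spinning at π/2 rad/s is rotated back by 90° before being accumulated into the map, which is what removes the "smearing" a spinning sensor would otherwise produce.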
@AnonymousAuthor681 Thank you for your kind reply🥰
Thanks for the nice work! Is this work published, or are the algorithms going to be published?
Thanks! This paper is under review
Any github can be referenced?
The paper is under review; we'll release the code after publication.
Creating new Call of Duty map
Machine learning?