This article presents improvements to LIDAR-based systems.
LIDAR-based systems measure the time-of-flight of a laser pulse from the source to the scene and back to the sensor, building a wide field-of-view 3D raster image. Because the image is acquired by scanning, motion within the scene over the duration of the scan introduces artifacts. By illuminating the entire scene simultaneously with a broad laser pulse, a 2D camera equipped with a high-speed shutter can measure time-of-flight across the entire field of view (FOV), recording an instantaneous snapshot of the whole scene; however, spreading the laser energy over the full FOV reduces range. What is needed, therefore, is a programmable system that can track multiple regions of interest by varying the field of regard among (1) a single direction, (2) the entire FOV, or (3) intermediate views of interest, as required by the evolving scene environment. That was the objective of this project.
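As a point of reference for the time-of-flight principle described above, the sketch below shows the standard range relation, range = c·Δt/2, where Δt is the measured round-trip time of the pulse. The function name and example values are illustrative assumptions, not details of the reported system.

```python
# Illustrative sketch only: basic time-of-flight range relation, range = c * dt / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_round_trip_time(dt_seconds: float) -> float:
    """Return target range in meters for a measured round-trip time dt_seconds."""
    return SPEED_OF_LIGHT_M_PER_S * dt_seconds / 2.0

if __name__ == "__main__":
    # Example: a 100 ns round trip corresponds to roughly 15 m of range.
    print(range_from_round_trip_time(100e-9))
```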