TY - GEN
T1 - A multi-modal scanning system to digitize CBRNE emergency response scenes
AU - Salathe, Marco
AU - Quiter, Brian J.
AU - Bandstra, Mark S.
AU - Chen, Xin
AU - Negut, Victor
AU - Folsom, Micah
AU - Weber, Gunther H.
AU - Greulich, Christopher
AU - Swinney, Mathew
AU - Prins, Nicholas
AU - Archer, Daniel E.
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - A handheld system developed to digitize a contextual understanding of the scene at chemical, biological, radiological, nuclear, and/or explosives (CBRNE) events is described. The system uses LiDAR and cameras to create a colorized 3D model of the environment, which helps domain experts who are supporting responders in the field. To generate the digitized model, a responder scans any suspicious objects and the surroundings by carrying the system through the scene. The scanning system provides a real-time user interface to inform the user about scanning progress and to indicate any areas that may have been missed by either the LiDAR sensors or the cameras. Currently, the collected data are post-processed on a different device, building a colorized triangular mesh of the encountered scene, with the intention of moving this pipeline to the scanner at a later point. The mesh is sufficiently compressed to be sent over a reduced-bandwidth connection to a remote analyst. Furthermore, the system tracks fiducial markers attached to diagnostic equipment that is placed around the suspicious object. The resulting tracking information can be transmitted to remote analysts to further facilitate their supporting efforts. The paper discusses the system's design, software components, the user interface used for scanning a scene, the necessary procedures for calibration of the sensors, and the processing steps of the resulting data. The discussion closes by evaluating the system's performance on 11 scenes.
AB - A handheld system developed to digitize a contextual understanding of the scene at chemical, biological, radiological, nuclear, and/or explosives (CBRNE) events is described. The system uses LiDAR and cameras to create a colorized 3D model of the environment, which helps domain experts who are supporting responders in the field. To generate the digitized model, a responder scans any suspicious objects and the surroundings by carrying the system through the scene. The scanning system provides a real-time user interface to inform the user about scanning progress and to indicate any areas that may have been missed by either the LiDAR sensors or the cameras. Currently, the collected data are post-processed on a different device, building a colorized triangular mesh of the encountered scene, with the intention of moving this pipeline to the scanner at a later point. The mesh is sufficiently compressed to be sent over a reduced-bandwidth connection to a remote analyst. Furthermore, the system tracks fiducial markers attached to diagnostic equipment that is placed around the suspicious object. The resulting tracking information can be transmitted to remote analysts to further facilitate their supporting efforts. The paper discusses the system's design, software components, the user interface used for scanning a scene, the necessary procedures for calibration of the sensors, and the processing steps of the resulting data. The discussion closes by evaluating the system's performance on 11 scenes.
UR - http://www.scopus.com/inward/record.url?scp=85147538035&partnerID=8YFLogxK
U2 - 10.1109/SSRR56537.2022.10018826
DO - 10.1109/SSRR56537.2022.10018826
M3 - Conference contribution
AN - SCOPUS:85147538035
T3 - SSRR 2022 - IEEE International Symposium on Safety, Security, and Rescue Robotics
SP - 74
EP - 79
BT - SSRR 2022 - IEEE International Symposium on Safety, Security, and Rescue Robotics
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 IEEE International Symposium on Safety, Security, and Rescue Robotics, SSRR 2022
Y2 - 8 November 2022 through 10 November 2022
ER -