Title

Depth and thermal sensor fusion to enhance 3D thermographic reconstruction

Document Type

Article

Publication Date

4-2-2018

Abstract

Three-dimensional (3D) geometrical models with incorporated surface temperature data provide important information for various applications such as medical imaging, energy auditing, and intelligent robots. In this paper, we present a robust method for mobile and real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is calibrated geometrically and used for data capture. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a thermal-guided iterative closest point (T-ICP) methodology to facilitate reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found by maximizing the thermal consistency between consecutive infrared images. The coarse pose estimate is further refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that complementary information captured by multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
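A minimal sketch of the kind of combined objective the abstract describes, assuming a point-to-plane geometric term and a per-point thermographic residual; the symbols, weighting scheme, and projection model here are illustrative assumptions, not taken from the paper:

E(\mathbf{T}) = \sum_{i} \Big[ \lambda \, \big( \mathbf{n}_{q_i}^{\top} (\mathbf{T}\mathbf{p}_i - \mathbf{q}_i) \big)^2 + (1 - \lambda) \, \big( I_t\!\big(\pi(\mathbf{T}\mathbf{p}_i)\big) - \tau_i \big)^2 \Big]

where \mathbf{T} is the rigid-body transform being estimated, \mathbf{p}_i and \mathbf{q}_i are corresponding source and target points, \mathbf{n}_{q_i} is the target surface normal, I_t is the current thermal image, \pi is the thermal-camera projection, \tau_i is the temperature previously associated with \mathbf{p}_i, and \lambda balances the geometric and thermographic terms.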

Publication Source (Journal or Book title)

Optics Express

First Page

8179

Last Page

8193
