Abstract
Infrared and visible light image fusion has become a research hot spot in multi-sensor fusion in recent years. Existing infrared and visible light fusion systems use two separate cameras and therefore require image registration before fusion, yet the practical performance of registration techniques remains limited. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: a beam splitter prism projects the coaxial light entering through a single lens onto an infrared charge coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied together with the signal acquisition and fusion process. A simulation experiment covering the entire chain of the optical system, signal acquisition, and signal fusion is constructed based on an imaging effect model, and quality evaluation indices are adopted to analyze the simulation results. The experimental results demonstrate that the proposed sensor device is effective and feasible.
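The key benefit of the coaxial beam-splitter design is that the infrared and visible channels are pixel-aligned by construction, so fusion can operate directly on the two images without a registration step. The sketch below illustrates this idea with a simple pixel-wise weighted fusion and a Shannon-entropy quality index; it is a minimal illustration only, not the paper's actual fusion algorithm or evaluation metric, and the function names, the weighting scheme, and the choice of entropy as the index are assumptions.

```python
import numpy as np

def fuse_weighted(ir, vis, w=0.5):
    """Pixel-wise weighted fusion of two co-registered images.

    Because the proposed sensor shares one lens via a beam splitter,
    the IR and visible frames are assumed coaxial and pixel-aligned,
    so no registration is performed before fusing.
    """
    ir = ir.astype(np.float64)
    vis = vis.astype(np.float64)
    return w * ir + (1.0 - w) * vis

def shannon_entropy(img, bins=256):
    """Shannon entropy of an 8-bit image, one common fusion quality index:
    higher entropy suggests the fused image carries more information."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is well defined
    return float(-np.sum(p * np.log2(p)))
```

In practice the fixed weight `w` would be replaced by a content-adaptive rule (e.g., saliency- or decomposition-based weighting), and several indices would be computed together to evaluate the fused output.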
Acknowledgment
This study was supported by the National Natural Science Foundation of China (Grant No. 51274150) and the Natural Science Foundation of Shanxi Province, China (Grant No. 201601D011059).
Cite this article
Qiao, T., Chen, L., Pang, Y. et al. Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion. Photonic Sens 8, 134–145 (2018). https://doi.org/10.1007/s13320-018-0401-4