Please use this identifier to cite or link to this item: https://gnanaganga.inflibnet.ac.in:8443/jspui/handle/123456789/2472
Title: A Denoising Framework For 3D And 2D Imaging Techniques Based On Photon Detection Statistics
Authors: Dodda, Vineela Chandra
Kuruguntla, Lakshmi
Elumalai, Karthikeyan
Chinnadurai, Sunil
Sheridan, John T.
Muniraj, Inbarasan
Keywords: Photon Detection Statistics
3D And 2D Imaging Techniques
Photon Counting Imaging
Denoising Framework
Issue Date: 24-Jan-2023
Publisher: Scientific Reports
Abstract: A method to capture image data of three-dimensional (3D) objects under extremely low light level conditions, known as Photon Counting Imaging (PCI), was reported. It is demonstrated that, by combining a PCI system with computational integral imaging algorithms, 3D scene reconstruction and recognition are possible. The resulting reconstructed 3D images often look degraded (due to the limited number of photons detected in a scene) and therefore require the application of superior image restoration techniques to improve object recognition. Recently, Deep Learning (DL) frameworks have been shown to perform well when used for denoising. In this paper, for the first time, a fully unsupervised network (i.e., U-Net) is proposed to denoise the photon-counted 3D sectional images. In conjunction with the classical U-Net architecture, a skip block is used to extract meaningful patterns from the photon-counted 3D images. The encoder and decoder blocks in the U-Net are connected with skip blocks in a symmetric manner. It is demonstrated that the proposed DL network performs better, in terms of peak signal-to-noise ratio, than the classical Total Variation (TV) denoising algorithm.
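The abstract builds on two ingredients that can be stated concretely: photon-counted images, where the observed count at each pixel follows Poisson statistics with a mean proportional to the normalized scene irradiance, and the peak signal-to-noise ratio (PSNR) used to compare denoisers. The paper itself does not include code in this record; the following is a minimal numpy sketch of that photon-detection model under those standard assumptions, with hypothetical function names chosen for illustration.

```python
import numpy as np

def photon_count_image(irradiance, n_photons, rng=None):
    """Simulate photon-limited detection: the expected count at each
    pixel is proportional to the normalized scene irradiance, and the
    observed counts are independent Poisson random variables."""
    rng = np.random.default_rng() if rng is None else rng
    p = irradiance / irradiance.sum()   # normalized irradiance (sums to 1)
    lam = n_photons * p                 # expected photons per pixel
    return rng.poisson(lam)

def psnr(reference, estimate):
    """Peak signal-to-noise ratio in dB, the figure of merit used
    to compare denoising methods."""
    mse = np.mean((reference - estimate) ** 2)
    peak = reference.max()
    return 10.0 * np.log10(peak ** 2 / mse)
```

For example, simulating a scene with only a few thousand detected photons in total yields counts whose per-pixel means are of order one, which is the severely degraded regime the denoising network is designed for.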
URI: https://doi.org/10.1038/s41598-023-27852-5
http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/2472
ISSN: 2045-2322
Appears in Collections:Journal Articles

Files in This Item:
File: s41598-023-27852-5.pdf (Restricted Access)
Size: 1.83 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.