Committee Chair

Wu, Dalei; Liang, Yu

Committee Member

Sartipi, Mina; Gao, Lan; Huston, D. (Dryver R.), 1958-

Department

Dept. of Computer Science and Engineering

College

College of Engineering and Computer Science

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

Ground penetrating radars (GPRs) have been extensively used in many industrial applications, such as coal mining, structural health monitoring, subsurface utility detection and localization, and autonomous driving. Most existing GPR systems are human-operated because configuring their operation requires experience in interpreting the collected GPR data. To achieve the best subsurface sensing performance, it is desirable to design an autonomous GPR system that can operate adaptively under varying sensing conditions. In this research, first, a generic architecture for cognitive GPRs based on edge computing is studied. The operation of cognitive GPRs under this architecture is formulated as a sequential decision process. Then a cognitive GPR based on 2D B-Scan image analysis and a deep Q-learning network (DQN) is investigated. A novel entropy-based reward function is designed for the DQN model by using the results of subsurface object detection (via region-of-interest identification) and recognition (via classification). Furthermore, to acquire a global view of subsurface objects with complex shape configurations, the 2D B-Scan image analysis is extended to 3D GPR data analysis, termed "Scan Cloud." A scan-cloud-enabled cognitive GPR is studied based on an advanced deep reinforcement learning method called deep deterministic policy gradient (DDPG), with a new reward function derived from 3D GPR data. The proposed methods are evaluated using the GPR modeling and simulation software gprMax. Simulation results show that our proposed cognitive GPRs outperform other GPR systems in terms of detection accuracy, operating time, and object reconstruction.
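The abstract describes an entropy-based reward that couples classifier confidence to the DQN's reinforcement signal. As a purely illustrative sketch (the exact formulation appears in the dissertation itself, and the function name, `detected` flag, and normalization here are assumptions), one common way to turn a softmax output into such a reward is to penalize high Shannon entropy:

```python
import numpy as np

def entropy_reward(class_probs, detected, max_reward=1.0):
    """Illustrative entropy-based reward (hypothetical form, not the
    dissertation's exact definition): a confident classification of a
    detected subsurface object (low-entropy softmax output) earns a
    reward near max_reward; a uniform, uncertain output earns ~0."""
    if not detected:
        # No region of interest found: no reward for this scan action.
        return 0.0
    p = np.asarray(class_probs, dtype=float)
    k = p.size
    # Shannon entropy of the classifier's softmax output.
    entropy = -np.sum(p * np.log(p + 1e-12))
    # Normalize by log(k), the maximum possible entropy over k classes,
    # so the reward lies in [0, max_reward].
    return max_reward * (1.0 - entropy / np.log(k))
```

Under this sketch, a one-hot prediction over three classes yields a reward near 1.0, while a uniform prediction yields a reward near 0, giving the agent an incentive to choose scan configurations that produce unambiguous detections.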

Acknowledgments

To begin, I would like to acknowledge my beautiful wife, Evelyn Omwenga, and our adorable children, Israel K. Omwenga, Priscilla P. Nyaboke, and Brooklyn G. Gesare. You were here for all of the ups and downs. I could not have made it this far without my sweethearts by my side. Furthermore, I would like to thank my advisor Dr. Wu, my co-advisor Dr. Liang, and everyone in the networked intelligence lab for giving me the opportunity to conduct research in an academic environment and for facilitating my growth as a researcher. In addition, I would like to thank Dakila Ledesma, Grace Nansamba, Evelyn Namugwanya, Dr. Amani Altarawneh, Mehran Gafhari, Sai Medury, and the rest of the Ph.D. students. From the very beginning, they challenged me, which enabled me to grow in the field of machine learning. Without them, I would not have been able to undertake a research project as challenging as this. I would also like to extend a special thank you to my thesis committee. The road to the defense has been rocky, but we finally made it. To my dad, Meshack N. Omwenga, thank you for speaking a blessing on me 21 years ago. You prayed that I would become a doctor by 32 years old. I am excited that, at 39 years young, I made it. Dr. Joseph Kizza and Dr. Immaculate Kizza, thank you for the continuous encouragement and sound wisdom. Finally, to my family and friends, thank you for the prayers, the earnest presence, and for loving me through it all. I love you all. This work was supported by the National Science Foundation under grant numbers 1647175 and 1924278.

Degree

Ph.D.; A dissertation submitted to the faculty of the University of Tennessee at Chattanooga in partial fulfillment of the requirements for the degree of Doctor of Philosophy.

Date

8-2021

Subject

Edge computing; Ground penetrating radar; Machine learning

Keyword

Autonomous cognitive GPR; deep reinforcement learning; edge computing; subsurface object detection and recognition; scan cloud; 3D reconstruction

Document Type

Doctoral dissertations

DCMI Type

Text

Extent

xvii, 100 leaves

Language

English

Rights

http://rightsstatements.org/vocab/InC/1.0/

License

http://creativecommons.org/licenses/by-nc-nd/3.0/
