Committee Chair

Kizza, Joseph

Committee Member

Wu, Dalei; Liang, Yu; Kong, Lingju

Department

Dept. of Computer Science and Engineering

College

College of Engineering and Computer Science

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

This thesis introduces a novel Unified Algebraic Framework, including an expandable Python Functions Package built on an extensible 6-Set Discrete Probability Algebra. The motivation for this research is to provide a unified, general, and extensible quantitative analysis tool for examining deep neural networks at the neuron level, improving the transparency of how the black box works, and enabling advances in concrete applications. Our approach extends the 6-Set Discrete Probability Algebra into a more systematic quantitative framework that incorporates analysis of the discrete probability distributions of neurons within deep neural network structures. Our methodology applies the framework to existing models, using visualization to show quantitatively how the algebra works, and then implements neuron-level applications in classical scenarios. The key contributions include a mathematical 6-Set Discrete Probability Algebra that offers a more robust and well-reasoned foundation for how neurons function in deep learning networks, and a quantitative analysis of neuron probability distributions that supplies evidence and knowledge to reduce reliance on intuition and trial-and-error in the selection and design of neural networks. The thesis also provides an off-the-shelf, expandable quantitative research tool that can be applied in the current domain and customized for other fields. In addition, it demonstrates how the framework defines and measures dissimilarity between neurons to improve diversity in ensemble learning, similarity to achieve neuron-level knowledge transfer, minimum-distance perturbation to optimize network structure through pruning, and entropy based on differences between neurons to support interpretability and explainability. This research combines a more sophisticated algebraic approach to AI (Artificial Intelligence) with practical frameworks and tools that can be applied directly in deep learning applications to enhance effectiveness and efficiency, and it will be useful to researchers interested in this domain.
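For illustration only, the sketch below shows one plausible way to estimate a neuron's discrete probability distribution from recorded activations and to compare two neurons with an entropy-based dissimilarity (here Jensen-Shannon divergence). The function names, the six-bin discretization, and the choice of divergence are assumptions made for this sketch; they are not the definitions or the Python Functions Package described in the thesis.

```python
import numpy as np

def neuron_distribution(activations, bin_edges):
    """Discrete probability distribution of one neuron, obtained by
    binning its activations over a dataset into shared bins."""
    counts, _ = np.histogram(activations, bins=bin_edges)
    return counts / counts.sum()

def entropy(p, eps=1e-12):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p))

def js_dissimilarity(p, q):
    """Entropy-based dissimilarity between two neuron distributions:
    Jensen-Shannon divergence, used here only as a stand-in measure."""
    m = 0.5 * (p + q)
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

# Hypothetical activations of two neurons over the same inputs.
rng = np.random.default_rng(0)
neuron_a = rng.normal(0.0, 1.0, size=10_000)
neuron_b = rng.normal(0.5, 1.5, size=10_000)

# Six shared bins loosely echo the "6-Set" discretization idea.
edges = np.histogram_bin_edges(np.concatenate([neuron_a, neuron_b]), bins=6)
p = neuron_distribution(neuron_a, edges)
q = neuron_distribution(neuron_b, edges)
print("dissimilarity(a, b) =", js_dissimilarity(p, q))
```

A low dissimilarity under such a measure would mark candidate neurons for knowledge transfer or pruning, while a high value would indicate the diversity sought in ensemble learning; the thesis develops its own measures for these tasks.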

Degree

Ph. D.; A dissertation submitted to the faculty of the University of Tennessee at Chattanooga in partial fulfillment of the requirements of the degree of Doctor of Philosophy.

Date

8-2024

Subject

Deep learning (Machine learning); Neural networks (Computer science)

Keyword

Unified Algebraic Framework; 6-Set Discrete Probability Algebra; Neuron-Level Analysis Tool; Unified Algebraic Measure; Deep Learning; Transfer Learning

Document Type

Doctoral dissertations

DCMI Type

Text

Extent

xxvii, 301 leaves

Language

English

Rights

http://rightsstatements.org/vocab/InC/1.0/

License

http://creativecommons.org/licenses/by/4.0/
