Committee Chair

Kaplanoglu, Erkan

Committee Member

Nasab, Ahad; Liang, Yu; Erdemir, Gokhan

Department

Dept. of Computational Science

College

College of Engineering and Computer Science

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

Enhancing the control of prosthetic hands is a crucial challenge that directly impacts the daily functionality of individuals with limb loss. This research investigates advanced machine learning (ML) methodologies to accurately predict hand-grasping orientations and thereby improve the precision of prosthetic control. We present a comprehensive framework that combines inputs from multiple sensors: electroencephalography (EEG) to discern user intention, electromyography (EMG) to evaluate muscle activity, and inertial measurement units (IMU) to track features of hand movement. By integrating these varied data streams, our ML model produces hand-orientation predictions that are more accurate and intuitive than those achievable with single-sensor systems. This approach has the potential to significantly elevate prosthetic functionality and user experience, enabling more precise and effortless execution of grasping tasks. Integrating the sensors within a single, cohesive ML framework allows dynamic assessment of diverse physical and neurological cues, and this methodological synergy enhances the prosthesis's adaptability to each user's unique movement patterns and neural commands. Applying deep learning techniques, particularly a combination of ML models, we propose a new model called AutoMerNet, which further improves the system's ability to learn from complex, multi-modal sensor data and to refine its predictions over time. This research contributes to the technological advancement of prosthetic hands and opens avenues for personalized prosthetic adjustments based on individual physiological and biomechanical characteristics. The enhanced control provided by our ML model holds promise for significantly improving prosthetic users' quality of life, facilitating more natural and effective interaction with their environment.
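As a rough illustration of the multi-modal late-fusion idea described in the abstract (not the dissertation's AutoMerNet architecture, whose details are not given here), the following PyTorch sketch encodes each sensor stream separately and concatenates the features for a shared orientation classifier. Channel counts, window length, class count, and all layer sizes are placeholder assumptions.

```python
import torch
import torch.nn as nn

class MultiModalFusionNet(nn.Module):
    """Late-fusion sketch: one small encoder per sensor modality,
    concatenated features feed a shared orientation classifier."""

    def __init__(self, eeg_ch=8, emg_ch=4, imu_ch=6, n_orientations=6):
        super().__init__()
        # One 1-D convolutional encoder per modality (channels x time samples).
        self.eeg_enc = self._encoder(eeg_ch)
        self.emg_enc = self._encoder(emg_ch)
        self.imu_enc = self._encoder(imu_ch)
        # Fused 96-dim feature vector -> grasp-orientation logits.
        self.head = nn.Sequential(
            nn.Linear(3 * 32, 64), nn.ReLU(), nn.Linear(64, n_orientations)
        )

    @staticmethod
    def _encoder(in_ch):
        return nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # average over the time axis -> (batch, 32, 1)
            nn.Flatten(),             # -> (batch, 32)
        )

    def forward(self, eeg, emg, imu):
        # Each input: (batch, channels, samples) for one synchronized window.
        fused = torch.cat(
            [self.eeg_enc(eeg), self.emg_enc(emg), self.imu_enc(imu)], dim=1
        )
        return self.head(fused)

# Smoke test on random 200-sample windows (batch of 2).
net = MultiModalFusionNet()
logits = net(torch.randn(2, 8, 200), torch.randn(2, 4, 200), torch.randn(2, 6, 200))
print(logits.shape)  # torch.Size([2, 6])
```

Fusing after per-modality encoding lets each branch learn representations suited to its signal (slow EEG rhythms, bursty EMG activation, smooth IMU kinematics) before the classifier weighs them jointly; this is one common design choice, offered here only as a sketch.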

Acknowledgments

I express my deepest gratitude to my esteemed advisor, Dr. Erkan Kaplanoglu, whose invaluable guidance, unwavering support, and insightful feedback have been pivotal throughout this project. This achievement would not have been possible without his insight and steadfast encouragement. I extend my heartfelt thanks to Dr. Ahad Nasab, Dean of the College of Engineering and Computer Science at the University of Tennessee at Chattanooga, for his support and encouragement; his visionary leadership and commitment to fostering an enriching research environment have contributed significantly to my academic and professional development. I am also profoundly grateful to my distinguished committee members, Dr. Yu Liang and Dr. Gokhan Erdemir, for their exceptional guidance and support. I am deeply thankful to my spouse, Amin, whose support and understanding have been a constant source of strength and motivation throughout this journey. Finally, I am immensely grateful to my parents for their endless love, encouragement, and belief in my abilities, which have been the foundation of all my accomplishments.

Degree

Ph.D.; A dissertation submitted to the faculty of the University of Tennessee at Chattanooga in partial fulfillment of the requirements for the degree of Doctor of Philosophy.

Date

8-2024

Subject

Artificial hands; Machine learning; Deep learning (Machine learning)

Keyword

EEG, EMG, IMU, Hand Grasping Orientation, Machine Learning, Prosthetic Hand Control

Document Type

Doctoral dissertations

DCMI Type

Text

Extent

xiii, 90 leaves

Language

English

Rights

http://rightsstatements.org/vocab/InC/1.0/

License

http://creativecommons.org/licenses/by/4.0/
