An Assessment of Entropy-Based Data Reduction for SEI Within IoT Applications

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

The research community remains focused on addressing Internet of Things (IoT) security concerns due to the continued proliferation of IoT devices and their use of weak or no encryption. Specific Emitter Identification (SEI) has been introduced to combat this security vulnerability. Recently, Deep Learning (DL) has been leveraged to accelerate SEI using the signals' Time-Frequency (TF) representations. While TF representations improve DL-based SEI accuracy over raw-signal learning, these transforms generate large amounts of data that are computationally expensive to store and for the DL network to process. This study investigates the use of entropy-based data reduction applied to "tiles" selected from the signals' TF representations. Our results show that entropy-based data reduction lowers average SEI performance by as little as 0.86% while reducing the memory and training-time requirements by as much as 92.65% and 80.7%, respectively.
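
Below is a minimal sketch of the kind of entropy-based tile selection the abstract describes: tile a signal's TF representation (here, an STFT magnitude spectrogram), score each tile by Shannon entropy, and keep only a high-entropy subset as the reduced input to a DL network. The STFT parameters, tile size, keep-fraction, and the choice to retain the highest-entropy tiles are illustrative assumptions, not values taken from the paper.

# Sketch of entropy-based tile selection over a time-frequency representation.
# Assumptions (not from the paper): STFT settings, 16x16 tiles, keeping the
# top 25% of tiles ranked by Shannon entropy.

import numpy as np
from scipy.signal import stft


def spectrogram_tiles(signal, fs, tile_shape=(16, 16)):
    """Compute an STFT magnitude spectrogram and split it into tiles."""
    _, _, Z = stft(signal, fs=fs, nperseg=128)
    S = np.abs(Z)
    th, tw = tile_shape
    # Trim so the spectrogram divides evenly into tiles.
    S = S[: S.shape[0] // th * th, : S.shape[1] // tw * tw]
    tiles = S.reshape(S.shape[0] // th, th, S.shape[1] // tw, tw)
    return tiles.transpose(0, 2, 1, 3).reshape(-1, th, tw)


def tile_entropy(tile, bins=32):
    """Shannon entropy (bits) of a tile's magnitude histogram."""
    hist, _ = np.histogram(tile, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


def select_tiles(tiles, keep_fraction=0.25):
    """Keep the highest-entropy fraction of tiles (the reduced dataset)."""
    scores = np.array([tile_entropy(t) for t in tiles])
    k = max(1, int(keep_fraction * len(tiles)))
    keep = np.argsort(scores)[-k:]
    return tiles[keep]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sig = rng.standard_normal(8192)  # placeholder for a captured IoT emission
    tiles = spectrogram_tiles(sig, fs=1e6)
    reduced = select_tiles(tiles, keep_fraction=0.25)
    print(f"{len(tiles)} tiles -> {len(reduced)} kept "
          f"({100 * (1 - len(reduced) / len(tiles)):.1f}% reduction)")

In this sketch the memory saving comes directly from the keep-fraction: storing 25% of the tiles discards 75% of the TF data before training, which is the trade-off the reported 92.65% memory and 80.7% training-time reductions quantify for the paper's own configuration.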

Document Type

Poster

Language

English

Rights

http://rightsstatements.org/vocab/InC/1.0/

License

http://creativecommons.org/licenses/by/4.0/

2022_MILCOM__v02_Camera_Ready_.pdf (897 kB)
Original paper presented at MILCOM 2022
