Document Type

Article

Publication Title

Machine Learning for Low Signal-to-Noise Ratio Detection

Abstract

Sensor networks collect data that is often contaminated by noise, so the data must be analyzed to determine whether a signal is present. This research project develops a machine learning algorithm that detects a signal in the presence of noise. The algorithm uses a long short-term memory (LSTM) network to determine the presence or absence of a signal in white Gaussian noise. The approach was tested with computer-generated data and achieves an accuracy of at least 98% for signal-to-noise ratios greater than -12 dB, and it detects signals with at least 65% accuracy for signal-to-noise ratios greater than approximately -26 dB. Moreover, the presence of an anomaly in the data does not substantially affect the detection accuracy. The detection method is therefore robust and has applications in surveillance and remote sensing.
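The abstract describes an LSTM classifier that decides whether a signal is present in additive white Gaussian noise at a given signal-to-noise ratio. The sketch below is a minimal illustration of that setup, not the authors' implementation: the sinusoidal signal model, network sizes, training loop, and all function names are assumptions introduced for illustration.

```python
# Hypothetical sketch: synthetic noisy data plus an LSTM detector for
# signal presence/absence in additive white Gaussian noise (AWGN).
# The tone signal, layer sizes, and training settings are assumptions,
# not details taken from the paper.
import numpy as np
import torch
import torch.nn as nn

def make_example(snr_db: float, length: int = 256, signal_present: bool = True):
    """Return one noisy sequence and its label (1.0 = signal present)."""
    noise = np.random.randn(length)                           # unit-power white Gaussian noise
    if signal_present:
        t = np.arange(length)
        signal = np.sqrt(2.0) * np.sin(2 * np.pi * 0.05 * t)  # unit-power tone (assumed signal model)
        signal *= 10 ** (snr_db / 20.0)                       # scale amplitude to the requested SNR
        x = signal + noise
    else:
        x = noise
    return x.astype(np.float32), float(signal_present)

class LSTMDetector(nn.Module):
    """LSTM followed by a linear head that scores signal presence."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                                     # x: (batch, length)
        out, _ = self.lstm(x.unsqueeze(-1))                   # (batch, length, hidden)
        return self.head(out[:, -1, :]).squeeze(-1)           # logit from the final time step

# Minimal training loop on randomly generated batches spanning -26 dB to 0 dB.
model = LSTMDetector()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):
    batch = [make_example(snr_db=np.random.uniform(-26, 0),
                          signal_present=bool(np.random.rand() < 0.5))
             for _ in range(32)]
    xs = torch.tensor(np.stack([b[0] for b in batch]))
    ys = torch.tensor([b[1] for b in batch])
    opt.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    opt.step()
```

Detection accuracy of a model like this would then be estimated by generating held-out sequences at each SNR level and comparing thresholded logits against the known labels.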

DOI

http://dx.doi.org/10.2139/ssrn.4543779

Publication Date

8-23-2023
