Statistical Complexity as a Criterion for the Useful Signal Detection Problem

Abstract

Three variants of the statistical complexity function, used as a criterion in the problem of detecting a useful signal in a signal-noise mixture, are considered. The probability distributions that maximize each of the considered variants of statistical complexity are obtained analytically, and conclusions are drawn about the efficiency of each variant for the detection problem. The considered information characteristics are compared, and the analytical results are illustrated with synthesized signals. A method is proposed for selecting the threshold of the information criterion that can be used in the decision rule for detecting a useful signal in a signal-noise mixture; the choice of the threshold relies a priori on the analytically obtained maximum values. As a result, the complexity based on the total variation demonstrates the best ability to detect a useful signal.
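The abstract gives no formulas, but the criterion it describes is of the LMC type: a normalized entropy multiplied by a disequilibrium term, with the total-variation distance from the uniform distribution as one possible disequilibrium. The sketch below is a minimal Python illustration of such a criterion applied to the power spectrum of an observed segment; the spectral estimate, the regularization constant, and the user-supplied threshold are illustrative assumptions and do not reproduce the authors' exact formulation.

```python
import numpy as np

def statistical_complexity_tv(x):
    """LMC-type statistical complexity of a signal segment: normalized
    spectral entropy multiplied by a total-variation disequilibrium.
    A sketch; not the exact criterion derived in the paper."""
    # Probability distribution taken from the normalized power spectrum.
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    n = p.size

    # Normalized Shannon entropy H in [0, 1] (small constant avoids log(0)).
    h = -np.sum(p * np.log(p + 1e-12)) / np.log(n)

    # Disequilibrium D in [0, 1]: total-variation distance from the uniform
    # distribution, rescaled by its maximum value 1 - 1/n.
    d = 0.5 * np.sum(np.abs(p - 1.0 / n)) / (1.0 - 1.0 / n)

    return h * d

def detect(x, threshold):
    """Decision rule: declare a useful signal present when the complexity of
    the observed mixture exceeds the threshold. In the paper the threshold is
    tied to the analytically obtained maximum of the criterion; here it is
    simply a user-supplied number."""
    return statistical_complexity_tv(x) > threshold

# Example on synthesized data: complexity of noise only vs. a tone in noise.
rng = np.random.default_rng(0)
t = np.arange(1024) / 1024.0
noise = rng.standard_normal(t.size)
mixture = np.sin(2 * np.pi * 60 * t) + noise
print(statistical_complexity_tv(noise), statistical_complexity_tv(mixture))
```

With a criterion of this form, the decision threshold can be expressed as a fraction of the criterion's maximum value, which is what the analytically obtained maxima mentioned in the abstract are used for.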

About the authors

A. A. Galyaev

Trapeznikov Institute of Control Sciences, Russian Academy of Sciences

Email: galaev@ipu.ru
Moscow, Russia

P. V. Lysenko

Trapeznikov Institute of Control Sciences, Russian Academy of Sciences

Email: pavellysen@ipu.ru
Moscow, Russia

L. M. Berlin

Trapeznikov Institute of Control Sciences, Russian Academy of Sciences

Author for correspondence.
Email: berlin.lm@phystech.edu
Moscow, Russia

Copyright (c) 2023 The Russian Academy of Sciences