Solution of the Problem of Classification of Hydroacoustic Signals Based on Harmonic Wavelets and Machine Learning

Detailed Bibliography
Published in: Pattern Recognition and Image Analysis, Vol. 30, No. 3, pp. 480–488
Main Authors: Klionskii, D. M., Kaplun, D. I., Voznesensky, A. S., Romanov, S. A., Levina, A. B., Bogaevskiy, D. V., Geppener, V. V., Razmochaeva, N. V.
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing; Springer Nature B.V., 01.07.2020
ISSN: 1054-6618, 1555-6212
Description
Summary: Two types of real hydroacoustic whale signals are classified with the k-NN algorithm using coefficients of the harmonic wavelet transform (HWT) in its fast implementation, the windowed Fourier transform (FT) (spectrogram), and the conventional FT. Classification accuracy is estimated for various signal-to-noise ratios (SNRs). To reduce the dimensionality of the feature space in the k-NN classification, the modulo N reduction method is proposed. The efficiency of harmonic wavelets for classifying complex nonstationary signals is demonstrated experimentally, and the applicability of speech-processing methods to underwater bioacoustic signals is confirmed: although these methods were originally developed around the characteristics of human speech, they nevertheless showed good results even without being tuned to the characteristics of the classified signals. The problem of classifying two types of whales by the sounds they make is also solved using a neural network.
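The pipeline described in the summary (spectral features, modulo N dimensionality reduction, k-NN vote) can be sketched as follows. This is only an illustrative reading, not the paper's implementation: it uses conventional-FT magnitudes rather than HWT coefficients, synthetic noisy tones in place of whale recordings, and it *assumes* "modulo N reduction" means folding a feature vector into N bins by summing entries with congruent indices.

```python
import numpy as np

def modulo_n_reduce(features, n):
    """Fold a feature vector into n bins by summing entries whose indices
    are congruent modulo n. NOTE: assumed interpretation of the record's
    'modulo N reduction'; the paper's exact scheme is not given here."""
    pad = (-len(features)) % n
    return np.pad(features, (0, pad)).reshape(-1, n).sum(axis=0)

def spectral_features(signal, n_bins=16):
    # Conventional-FT magnitude spectrum, then modulo-N reduction.
    return modulo_n_reduce(np.abs(np.fft.rfft(signal)), n_bins)

def knn_predict(train_X, train_y, x, k=3):
    # Plain k-NN: Euclidean distance, majority vote among the k nearest.
    dist = np.linalg.norm(train_X - x, axis=1)
    return np.bincount(train_y[np.argsort(dist)[:k]]).argmax()

# Toy stand-in data: two classes of noisy tones (not real whale calls).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024, endpoint=False)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(t.size)

# Class 0 tones fall in FFT bins congruent to 0 (mod 16), class 1 in bins
# congruent to 8, so the folded 16-dimensional features stay separable.
train_X = np.array([spectral_features(make(f)) for f in (48, 80, 112, 56, 88, 120)])
train_y = np.array([0, 0, 0, 1, 1, 1])

pred = knn_predict(train_X, train_y, spectral_features(make(64)))
print(pred)  # a 64 Hz tone folds into the class-0 bin
```

The folding step compresses a 513-point spectrum to 16 features, which is the kind of dimensionality reduction the k-NN distance computation benefits from; any such scheme trades fine frequency resolution for a smaller feature space.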
DOI: 10.1134/S1054661820030128