Enhancing classification robustness: Stability analysis of fuzzy and Shannon entropy of image under noisy environment.

Saved in:
Detailed bibliography
Title: Enhancing classification robustness: Stability analysis of fuzzy and Shannon entropy of image under noisy environment.
Authors: Khan, Fatema; Islam, Md. Imdadul; Jahan, Sarwar
Source: Signal, Image & Video Processing; Dec 2025, Vol. 19, Issue 16, pp. 1-9 (9 pp.)
Abstract: In this paper, fuzzy entropy is evaluated as a robust image feature under noisy conditions. The membership function (MF) parameters are varied by linearly widening and narrowing the width of the linguistic (fuzzy) values to maximize fuzzy entropy using a graphical solution instead of Evolutionary Computation (EC) or Particle Swarm Optimization (PSO), which are time-consuming and may suffer from convergence issues. The stability of fuzzy and Shannon entropy is analyzed under Additive White Gaussian Noise (AWGN) added to four standard grayscale images (Boat, Onion, Barbara, Baboon) across a wide SNR range (0 dB to –45 dB). Results show that fuzzy entropy maintains its stability under noise down to –27 dB, significantly outperforming Shannon entropy, which begins to degrade beyond –18 dB. Furthermore, to evaluate classification performance under noise, four fuzzy features are extracted from each image and used to train three machine learning (ML) classifiers, Linear Discriminant Analysis (LDA), Naive Bayes (NB), and Decision Tree (DT), on a rice leaf disease dataset consisting of 300 images across three classes. Their performance is compared with two deep learning (DL) models, CNN and LSTM, trained directly on the images while varying the image SNR. While the DL models achieve higher peak accuracy, the ML classifiers using fuzzy entropy features demonstrate more stable performance under increasing noise. This study highlights the potential of fuzzy entropy as a low-complexity, noise-resilient alternative for image classification and identifies a trade-off between stability and accuracy in ML vs. DL approaches. [ABSTRACT FROM AUTHOR]
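The abstract does not give the authors' exact membership functions or fuzzy-entropy formula, so the following is only a minimal Python sketch under stated assumptions: triangular MFs over the grey-level range, a De Luca & Termini-style fuzzy entropy, a histogram-based Shannon entropy, and AWGN scaled to a target SNR in dB. The function names (add_awgn, shannon_entropy, fuzzy_entropy) and MF centers/widths are illustrative, not taken from the paper.

```python
# Illustrative sketch (not the authors' code): Shannon vs. fuzzy entropy of a
# grayscale image under AWGN at a chosen SNR, assuming triangular membership
# functions and a De Luca & Termini-style fuzzy entropy.
import numpy as np

def add_awgn(img, snr_db):
    """Add white Gaussian noise so that signal power / noise power = 10**(snr_db/10)."""
    sig_power = np.mean(img.astype(float) ** 2)
    noise_power = sig_power / (10 ** (snr_db / 10.0))
    noise = np.random.normal(0.0, np.sqrt(noise_power), img.shape)
    return np.clip(img + noise, 0, 255)

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 255), density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    return -np.sum(p * np.log2(p))

def fuzzy_entropy(img, centers=(0, 85, 170, 255), width=85.0):
    """Fuzzy entropy from triangular MFs over grey levels (assumed form)."""
    x = img.astype(float).ravel()
    H = 0.0
    for c in centers:
        mu = np.clip(1.0 - np.abs(x - c) / width, 0.0, 1.0)  # triangular membership
        mu = mu[(mu > 0) & (mu < 1)]                          # avoid log(0)
        # De Luca & Termini term: -[mu ln mu + (1 - mu) ln(1 - mu)], pixel-averaged
        H += np.mean(-(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))) if mu.size else 0.0
    return H / len(centers)

if __name__ == "__main__":
    img = np.random.randint(0, 256, (256, 256))  # stand-in for Boat/Onion/Barbara/Baboon
    for snr in (0, -9, -18, -27, -36, -45):      # SNR sweep as described in the abstract
        noisy = add_awgn(img, snr)
        print(snr, round(shannon_entropy(noisy), 3), round(fuzzy_entropy(noisy), 3))
```

Under the abstract's claim, the fuzzy-entropy column would stay roughly constant down to about –27 dB while the Shannon-entropy column drifts earlier; the sketch only shows how such a sweep could be set up, not the paper's results.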
Database: Complementary Index
ISSN: 1863-1703
DOI: 10.1007/s11760-025-04971-2