Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization

Bibliographic Details
Title: Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization
Authors: Hyacinthe Hamon
Source: Journal of Artificial Intelligence General science (JAIGS), ISSN 3006-4023, 8:35–43
Publisher: Open Knowledge, 2025.
Publication Year: 2025
Description: Recent advancements in unsupervised learning have illuminated the interplay between machine learning algorithms and biological neural processes, particularly through sparse coding methodologies. This paper explores the significance of homeostatic mechanisms in optimizing unsupervised learning performance. I propose a novel algorithm integrating nonlinear functions to control coefficient selection in sparse coding, fostering a homeostatic balance among competing neurons. By implementing histogram equalization techniques, I demonstrate that adaptive homeostasis enhances coding efficiency and learning speed, surpassing traditional approaches. My findings reveal that effective homeostatic regulation prevents redundancy in neuron selection and promotes a balanced neural network structure, mirroring the dynamics of biological systems. The proposed algorithm's efficacy is quantitatively validated across various coding and learning scenarios, paving the way for improved real-world applications in convolutional neural networks (CNNs) and beyond.
Publication Type: Article
ISSN: 3006-4023
DOI: 10.60087/jaigs.v8i1.326
Rights: CC BY
Document Code: edsair.doi...........35b5e52639b63669a508393dbaf7f89d
Database: OpenAIRE
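The abstract's central mechanism, homeostasis through histogram equalization, can be illustrated with a minimal sketch: each neuron's raw activation is mapped through that neuron's own empirical cumulative distribution, so every neuron's equalized response is roughly uniform on [0, 1] and competition for selection becomes balanced regardless of per-neuron gain differences. This is an illustrative assumption of the general technique, not the paper's actual implementation; all names, gains, and bin counts below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_samples, n_bins = 8, 5000, 100

# Hypothetical raw activations: each neuron has a different gain, so a
# naive winner-take-all selection would keep picking high-gain neurons.
gains = np.linspace(0.5, 2.0, n_neurons)
activations = rng.random((n_samples, n_neurons)) * gains

# Homeostasis via histogram equalization: map each neuron's activation
# through its own empirical CDF, so equalized values are approximately
# uniform on [0, 1] for every neuron, whatever its gain.
edges = np.linspace(0.0, activations.max(), n_bins + 1)
cdfs = np.stack([
    np.cumsum(np.histogram(activations[:, i], bins=edges)[0]) / n_samples
    for i in range(n_neurons)
])

# Look up each activation's quantile in its neuron's CDF.
idx = np.clip(np.digitize(activations, edges) - 1, 0, n_bins - 1)
equalized = cdfs[np.arange(n_neurons)[None, :], idx]

# Compare how often each neuron "wins" the selection step.
raw_wins = np.bincount(activations.argmax(axis=1), minlength=n_neurons)
eq_wins = np.bincount(equalized.argmax(axis=1), minlength=n_neurons)
print("raw selection counts:      ", raw_wins)
print("equalized selection counts:", eq_wins)
```

With raw activations, the high-gain neurons dominate selection; after equalization, the win counts spread almost evenly across all eight neurons, which is the balanced competition the abstract attributes to adaptive homeostasis.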