Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization

Saved in:
Detailed bibliography
Title: Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization
Authors: Hyacinthe Hamon
Source: Journal of Artificial Intelligence General science (JAIGS), ISSN 3006-4023, 8:35-43
Publisher information: Open Knowledge, 2025.
Year of publication: 2025
Description: Recent advancements in unsupervised learning have illuminated the interplay between machine learning algorithms and biological neural processes, particularly through sparse coding methodologies. This paper explores the significance of homeostatic mechanisms in optimizing unsupervised learning performance. I propose a novel algorithm integrating nonlinear functions to control coefficient selection in sparse coding, fostering a homeostatic balance among competing neurons. By implementing histogram equalization techniques, I demonstrate that adaptive homeostasis enhances coding efficiency and learning speed, surpassing traditional approaches. My findings reveal that effective homeostatic regulation prevents redundancy in neuron selection and promotes a balanced neural network structure, mirroring the dynamics of biological systems. The proposed algorithm’s efficacy is quantitatively validated across various coding and learning scenarios, paving the way for improved real-world applications in convolutional neural networks (CNNs) and beyond.
Document type: Article
ISSN: 3006-4023
DOI: 10.60087/jaigs.v8i1.326
Rights: CC BY
Accession number: edsair.doi...........35b5e52639b63669a508393dbaf7f89d
Database: OpenAIRE
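The abstract's core idea, using a nonlinear function (histogram equalization) to rebalance which neurons win during sparse coefficient selection, can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the matching-pursuit-style selection, the random dictionary with skewed atom norms, and all parameter values are assumptions chosen to make the homeostatic effect visible.

```python
import numpy as np

rng = np.random.default_rng(0)

n_dict, n_dim, n_samples = 16, 8, 500

# Toy dictionary with deliberately unequal atom norms, so a plain
# argmax over raw coefficients would favor the large-norm atoms.
D = rng.normal(size=(n_dict, n_dim))
D /= np.linalg.norm(D, axis=1, keepdims=True)
D *= np.linspace(0.5, 2.0, n_dict)[:, None]

# Per-neuron running histogram of past coefficients; the empirical
# CDF built from it acts as each neuron's equalizing nonlinearity.
n_bins = 50
edges = np.linspace(0.0, 3.0, n_bins + 1)
counts = np.ones((n_dict, n_bins))  # uniform prior so early CDFs are defined

def equalize(c):
    """Map raw coefficients through each neuron's empirical CDF.

    Histogram equalization makes each neuron's transformed coefficient
    roughly uniform on [0, 1], so habitual winners are down-weighted and
    rarely selected neurons are boosted: a homeostatic balance.
    """
    cdf = np.cumsum(counts, axis=1)
    cdf /= cdf[:, -1:]
    idx = np.clip(np.digitize(c, edges) - 1, 0, n_bins - 1)
    return cdf[np.arange(n_dict), idx]

selections = np.zeros(n_dict)
for _ in range(n_samples):
    x = rng.normal(size=n_dim)
    c = np.abs(D @ x)                  # raw matching scores
    k = np.argmax(equalize(c))         # homeostatic winner selection
    selections[k] += 1
    idx = np.clip(np.digitize(c, edges) - 1, 0, n_bins - 1)
    counts[np.arange(n_dict), idx] += 1  # update per-neuron histograms

print(selections / n_samples)  # usage spread across all atoms
```

With this gain-skewed dictionary, selecting `np.argmax(c)` directly would concentrate almost all wins on the largest-norm atoms; routing the coefficients through each neuron's empirical CDF spreads selections across the whole dictionary, which is the redundancy-prevention effect the abstract attributes to adaptive homeostasis.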