Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization

Bibliographic Details
Title: Homeostatic Mechanisms in Unsupervised Learning: Enhancing Sparse Coding through Nonlinear Normalization
Authors: Hyacinthe Hamon
Source: Journal of Artificial Intelligence General Science (JAIGS), ISSN 3006-4023, 8(1): 35-43
Publisher Information: Open Knowledge, 2025.
Publication Year: 2025
Description: Recent advances in unsupervised learning have illuminated the interplay between machine learning algorithms and biological neural processes, particularly through sparse coding methodologies. This paper examines the role of homeostatic mechanisms in optimizing unsupervised learning performance. I propose a novel algorithm that integrates nonlinear functions to control coefficient selection in sparse coding, fostering a homeostatic balance among competing neurons. Using histogram equalization techniques, I demonstrate that adaptive homeostasis improves coding efficiency and learning speed over traditional approaches. My findings show that effective homeostatic regulation prevents redundancy in neuron selection and promotes a balanced neural network structure, mirroring the dynamics of biological systems. The algorithm's efficacy is quantitatively validated across a range of coding and learning scenarios, paving the way for improved real-world applications in convolutional neural networks (CNNs) and beyond.
Document Type: Article
ISSN: 3006-4023
DOI: 10.60087/jaigs.v8i1.326
Rights: CC BY
Accession Number: edsair.doi...........35b5e52639b63669a508393dbaf7f89d
Database: OpenAIRE
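The abstract's central idea, nonlinear normalization of coefficient selection so that competing neurons are chosen with roughly equal frequency, can be sketched compactly. The snippet below is a minimal illustration and not the paper's implementation: it assumes a matching-pursuit encoder, a Hebbian dictionary update, and per-neuron gain functions estimated as empirical CDFs of past coefficient magnitudes (a simple form of histogram equalization). All names, parameter values, and the history-update rule are illustrative assumptions.

```python
# Hedged sketch: homeostatic sparse coding with histogram-equalized atom
# selection. The encoder, dictionary update, and CDF-based gains below are
# illustrative assumptions, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_dim, n_atoms, n_active, n_hist = 64, 128, 8, 512

# Random unit-norm dictionary, plus a buffer of past coefficient magnitudes
# (one sorted row per atom) that serves as each neuron's empirical CDF.
D = rng.standard_normal((n_dim, n_atoms))
D /= np.linalg.norm(D, axis=0)
hist = np.sort(np.abs(rng.standard_normal((n_atoms, n_hist))), axis=1)

def encode(x):
    """Matching pursuit in which atoms compete on CDF-normalized scores."""
    residual, coeffs = x.copy(), np.zeros(n_atoms)
    for _ in range(n_active):
        c = D.T @ residual                      # linear correlations
        # Nonlinear normalization: rank each |c_i| within atom i's own
        # history, so chronically over-used atoms (whose histories hold
        # large magnitudes) need larger correlations to win selection.
        ranks = np.array([np.searchsorted(hist[i], abs(ci))
                          for i, ci in enumerate(c)])
        k = int(np.argmax(ranks))
        coeffs[k] += c[k]
        residual -= c[k] * D[:, k]
    return coeffs, residual

def learn_step(x, eta=0.05):
    """One Hebbian dictionary update plus a homeostatic history refresh."""
    coeffs, residual = encode(x)
    active = np.nonzero(coeffs)[0]
    D[:, active] += eta * np.outer(residual, coeffs[active])
    D[:, active] /= np.linalg.norm(D[:, active], axis=0)
    # Crude history update: evict each atom's smallest stored magnitude
    # and insert the current one, keeping rows sorted for searchsorted.
    hist[:, 0] = np.abs(D.T @ x)
    hist.sort(axis=1)
    return coeffs

for _ in range(100):
    learn_step(rng.standard_normal(n_dim))
```

The homeostatic step is the rank-based rescoring inside `encode`: an atom that wins often accumulates large magnitudes in its history, so the same raw correlation maps to a lower rank and the atom must earn its next selection with stronger evidence. This is what equalizes selection probabilities across neurons and prevents any single atom from dominating the code.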