A comparative study of the scalability of a sensitivity-based learning algorithm for artificial neural networks

Published in: Expert Systems with Applications, Volume 40, Issue 10, pp. 3900-3905
Main authors: Peteiro-Barral, Diego; Guijarro-Berdiñas, Bertha; Pérez-Sánchez, Beatriz; Fontenla-Romero, Oscar
Format: Journal Article
Language: English
Publication details: Amsterdam: Elsevier Ltd, 01.08.2013
ISSN: 0957-4174, 1873-6793
Online access: Get full text
Description
Summary: ► Researchers must now study not only accuracy but also scalability. ► Researchers are investigating the scalability of machine learning to large-scale problems. ► The scalability of popular training algorithms for ANNs is analyzed in this research. ► The training algorithm SBLLM performs better than others in terms of scalability. ► This research contributes to the standardization of scalability studies. Until recently, the most common criterion in machine learning for evaluating the performance of algorithms was accuracy. However, the unrestrained growth of the volume of data in recent years in fields such as bioinformatics, intrusion detection or engineering has raised new challenges in machine learning, concerning not only accuracy but also scalability. In this research, we are concerned with the scalability of one of the most well-known paradigms in machine learning, artificial neural networks (ANNs), particularly the training algorithm Sensitivity-Based Linear Learning Method (SBLLM). SBLLM is a learning method for two-layer feedforward ANNs, based on sensitivity analysis, that calculates the weights by solving a system of linear equations. The results show that SBLLM performs better in terms of scalability than five of the most popular and efficient training algorithms for ANNs.
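
The abstract only sketches the method, but the general idea behind linear-system-based training of a layer can be illustrated briefly. The sketch below is not the authors' SBLLM algorithm: it assumes a standard least-squares formulation with a tanh activation, and the names (fit_layer_weights, X, d) are illustrative.

# Minimal sketch (assumed formulation, not the paper's exact method):
# fit the weights of one feedforward layer by mapping the desired outputs
# through the inverse activation and solving a linear least-squares system.
import numpy as np

def fit_layer_weights(X, d):
    """Solve for W so that tanh(X_bias @ W) approximates the desired output d."""
    X_bias = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    target = np.arctanh(np.clip(d, -0.999, 0.999))      # inverse activation, clipped to avoid +/-inf
    W, *_ = np.linalg.lstsq(X_bias, target, rcond=None) # linear system solved in the least-squares sense
    return W

# Toy usage: recover the weights of a single tanh layer from noisy data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
true_W = rng.standard_normal((5, 2))
d = np.tanh(X @ true_W + 0.05 * rng.standard_normal((200, 2)))
W = fit_layer_weights(X, d)
print(W.shape)  # (6, 2): 5 inputs + bias, 2 outputs

Because each layer reduces to a linear solve rather than iterative gradient descent, this style of training scales well with the number of samples, which is the property the study compares across algorithms.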
DOI: 10.1016/j.eswa.2012.12.076