Detailed bibliography
| Title: |
Stochastic gradient descent‐based support vector machines training optimization on Big Data and HPC frameworks. |
| Authors: |
Abeykoon, Vibhatha; Fox, Geoffrey; Kim, Minje; Ekanayake, Saliya; Kamburugamuve, Supun; Govindarajan, Kannan; Wickramasinghe, Pulasthi; Perera, Niranda; Widanage, Chathura; Uyar, Ahmet; Gunduz, Gurhan; Akkas, Selahatin |
| Source: |
Concurrency & Computation: Practice & Experience; 4/10/2022, Vol. 34 Issue 8, p1-12, 12p |
| Subjects: |
Support vector machines; Big data; Programming languages; Hybrid systems; Machine learning |
| Abstract: |
Summary: The support vector machine (SVM) is a widely used machine learning algorithm. With the growing volume of research data, efficient training is more important than ever. This article discusses the performance optimizations and benchmarks involved in providing high-performance support for SVM training. In this research, we focused on a highly scalable gradient descent-based approach to implementing the core SVM algorithm. To provide a scalable solution, we designed optimized high-performance computing (HPC) and dataflow-oriented SVM implementations; in the HPC approach, the algorithm is implemented with the bulk synchronous parallel (BSP) model. In addition, we analyzed language-level and math-kernel optimizations in a prominent HPC programming language (C++) and a dataflow programming language (Java). In the experiments, we compared the performance of classic HPC models, classic dataflow models, and hybrid models designed on classic HPC and dataflow programming models. Our research illustrates a scientific approach to designing the SVM algorithm at scale in classic HPC, dataflow, and hybrid systems. [ABSTRACT FROM AUTHOR] |
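The core technique named in the abstract, gradient descent-based training of a linear SVM, can be illustrated with a minimal serial sketch. This is an assumption for illustration only: it uses a Pegasos-style stochastic subgradient step on the hinge loss with a decaying step size and hypothetical toy data, not the paper's parallel C++/Java implementations.

```python
import random

def train_svm_sgd(X, y, lam=0.01, epochs=100, seed=0):
    """Pegasos-style stochastic subgradient descent for a linear SVM
    (hinge loss, L2 regularization, no bias term). Illustrative sketch,
    not the authors' HPC/dataflow implementation."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            # subgradient step on (lam/2)*||w||^2 + hinge(margin)
            w = [(1.0 - eta * lam) * wj for wj in w]
            if margin < 1.0:               # point violates the margin
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Tiny linearly separable toy set (hypothetical data, for illustration)
X = [[2.0, 2.0], [1.5, 2.5], [-2.0, -1.5], [-1.0, -2.0]]
y = [1, 1, -1, -1]
w = train_svm_sgd(X, y)
print(all(predict(w, xi) == yi for xi, yi in zip(X, y)))  # True
```

In the distributed BSP setting the paper studies, each worker would run this inner loop on its own data shard and synchronize the model vector `w` (for example via an allreduce) at superstep boundaries; that parallel structure is what the benchmarks compare across HPC, dataflow, and hybrid frameworks.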
| Copyright of Concurrency & Computation: Practice & Experience is the property of Wiley-Blackwell, and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.) |
| Database: |
Complementary Index |