Effects of Extended Stochastic Gradient Descent Algorithms on Improving Latent Factor-Based Recommender Systems

Detailed bibliography
Published in: IEEE Robotics and Automation Letters, Volume 4, Issue 2, pp. 618-624
Main authors: Luo, Xin; Zhou, MengChu
Medium: Journal Article
Language: English
Publication details: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 April 2019
ISSN: 2377-3766
Online access: Full text available
Description
Summary: High-dimensional and sparse (HiDS) matrices from recommender systems contain various useful patterns. Latent factor (LF) analysis is highly efficient at grasping these patterns. Stochastic gradient descent (SGD) is a widely adopted algorithm for training an LF model. Can its extensions further improve an LF model's convergence rate and prediction accuracy for missing data? To answer this question, this work selects two representative extended SGD algorithms to propose two novel LF models. Experimental results on two HiDS matrices generated by real recommender systems show that, compared with standard SGD, extended SGD algorithms enable an LF model to achieve higher prediction accuracy for the missing data of an HiDS matrix, a faster convergence rate, and greater model diversity.
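The abstract contrasts standard SGD with extended SGD variants for training an LF model on the observed entries of an HiDS matrix, but it does not name the two extended algorithms the paper selects. The Python sketch below therefore uses classical momentum purely as an illustrative stand-in extension; the function and parameter names (train_lf, momentum, reg) are hypothetical, not the paper's. Setting momentum=0.0 recovers standard SGD.

import numpy as np

def train_lf(ratings, n_users, n_items, k=10, lr=0.01, reg=0.05,
             momentum=0.0, epochs=50, seed=0):
    """Train latent factor matrices P (users x k) and Q (items x k) by SGD.

    ratings: list of (user, item, value) triples, i.e. the known entries of
             the high-dimensional and sparse (HiDS) matrix.
    momentum=0.0 reproduces standard SGD; momentum>0 is the illustrative
    extended variant (an assumption, not the paper's specific algorithm).
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    vP = np.zeros_like(P)   # velocity buffers used only by the momentum variant
    vQ = np.zeros_like(Q)
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]              # prediction error on one known entry
            gP = -err * Q[i] + reg * P[u]      # gradients of the regularized squared error
            gQ = -err * P[u] + reg * Q[i]
            vP[u] = momentum * vP[u] - lr * gP # with momentum=0 this is a plain SGD step
            vQ[i] = momentum * vQ[i] - lr * gQ
            P[u] += vP[u]
            Q[i] += vQ[i]
    return P, Q

# Toy usage: three users, three items, five observed ratings.
obs = [(0, 0, 5.0), (0, 1, 3.0), (1, 1, 4.0), (2, 0, 1.0), (2, 2, 4.5)]
P, Q = train_lf(obs, n_users=3, n_items=3, momentum=0.9)
print("predicted missing entry (user 1, item 0):", P[1] @ Q[0])

Predictions for missing entries are simply the inner products of the learned user and item factors, which is what the abstract's "prediction accuracy for missing data" is measured on.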
DOI: 10.1109/LRA.2019.2891986