Scaling Up Optuna: P2P Distributed Hyperparameters Optimization

ABSTRACT In machine learning (ML), hyperparameter optimization (HPO) is the process of choosing a tuple of values that ensures efficient deployment and training of an AI model. In practice, HPO applies not only to ML tuning but can also be used to tune complex numerical simulations. In this conte...


Bibliographic Details
Published in: Concurrency and Computation, Vol. 37, No. 4-5
Main Author: Cudennec, Loïc
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc., 28.02.2025
Series: e70008
ISSN: 1532-0626, 1532-0634