Stochastic local search and parameters recommendation: a case study on flowshop problems


Bibliographic Details
Published in: International Transactions in Operational Research, Vol. 30, No. 2, pp. 774–799
Main Authors: Pavelski, Lucas M.; Delgado, Myriam; Kessaci, Marie‑Éléonore; Freitas, Alex A.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd / Wiley, 01.03.2023
ISSN: 0969-6016, 1475-3995
Description
Summary: The Algorithm Selection Problem (ASP) considers the use of previous knowledge regarding problem features and algorithm performance to recommend the best strategy to solve a previously unseen problem. In the application context, the usual ASP for optimization considers recommending the best heuristics, whenever it faces a new similar problem instance, also known as the Per‑Instance ASP. Although ASP for heuristic recommendation is not new, selecting heuristics and also their parameters, or the Per‑instance Algorithm Configuration Problem, is still considered a challenging task. This paper investigates the use of meta‑learning to recommend six different stochastic local searches and their parameters to solve several instances of permutation flowshop problems. The proposed approach uses several problem features, including fitness landscape metrics, builds the performance database using irace, and trains different multi‑label recommendation models on a data set with more than 6000 flowshop problem instances. Experiments show that decision tree‑based machine learning models achieve good performance, and the quality of the recommendations is capable of outperforming the state‑of‑the‑art algorithm with tuned configuration.
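To make the per‑instance recommendation idea concrete, the sketch below shows a minimal meta‑learning recommender using 1‑nearest‑neighbour over instance features. This is a simplified stand‑in (the paper trains decision‑tree multi‑label models on irace‑derived performance data); the feature names, data values, and label abbreviations here are all hypothetical.

```python
# Per-instance algorithm recommendation via meta-learning: a minimal
# 1-nearest-neighbour sketch. The paper uses decision-tree models;
# all features, values, and labels below are illustrative only.

import math

# Hypothetical meta-dataset: each row pairs an instance's feature
# vector [num_jobs, num_machines, landscape ruggedness] with the set
# of local searches that performed best on it (multi-label target).
meta_db = [
    ([20, 5, 0.31], {"ils", "ts"}),   # small instance, smooth landscape
    ([100, 10, 0.74], {"sa"}),        # large instance, rugged landscape
    ([50, 20, 0.52], {"ils"}),
]

def recommend(features):
    """Return the label set of the most similar training instance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = min(meta_db, key=lambda row: dist(row[0], features))
    return nearest[1]

# A new, unseen instance resembling the large rugged one:
print(recommend([90, 12, 0.7]))  # → {'sa'}
```

In the paper's setting, the feature vector would include fitness‑landscape metrics and the labels would cover both the choice of stochastic local search and its parameter configuration.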
DOI: 10.1111/itor.12922