Graph Representation Learning with Graph Transformers in Neural Combinatorial Optimization


Bibliographic Details
Published in: Proceedings of the IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 488–495
Main Authors: Wang, Tianze, Payberah, Amir H., Vlassov, Vladimir
Format: Conference Proceeding
Language: English
Published: IEEE 15.12.2023
ISSN: 1946-0759
Description
Summary: Neural combinatorial optimization aims to use neural networks to speed up the solving of combinatorial optimization problems, i.e., finding, from a finite set of feasible solutions, the optimal solution of a problem instance that minimizes a given objective function. Recently, researchers have applied convolutional neural networks to predict the optimal solution's cost (defined by the objective function) and provided it as an extra input to an exact solver to speed up the solving process. In this paper, we investigate whether graph representations that explicitly model the inherent constraints of combinatorial optimization problems improve the prediction of the optimal solution's cost. Specifically, we use graph neural networks with neighborhood aggregation and graph Transformer models to capture and embed the knowledge in the graph representations of combinatorial optimization problems. We also propose a benchmark dataset containing the Traveling Salesman Problem (TSP) and the Job-Shop Scheduling Problem (JSSP), and through empirical evaluation, we show that graph Transformer models achieve an average loss decrease of 61.05% on TSP and 66.53% on JSSP compared to the baseline convolutional neural networks.
DOI: 10.1109/ICMLA58977.2023.00074
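The summary describes graph neural networks with neighborhood aggregation applied to graph representations of TSP instances. As an illustrative sketch only (this is not the paper's architecture; the distance-based softmax weighting, feature choice, and all names below are assumptions), one round of neighborhood aggregation on a toy TSP graph could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy TSP instance: n cities with 2-D coordinates. The complete graph's
# edge weights are the pairwise Euclidean distances between cities.
n = 5
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# Initial node features: the city coordinates themselves.
h = coords.copy()

def aggregate(h, dist):
    """One round of distance-weighted mean neighborhood aggregation:
    each node averages its neighbors' features, weighting nearer
    neighbors more heavily (softmax over negative distances)."""
    w = np.exp(-dist)
    np.fill_diagonal(w, 0.0)               # exclude self-loops
    w = w / w.sum(axis=1, keepdims=True)   # normalize each node's weights
    return w @ h

# Two message-passing rounds, then a mean-pooled graph embedding that a
# regression head could map to a predicted optimal-tour cost.
for _ in range(2):
    h = aggregate(h, dist)
graph_embedding = h.mean(axis=0)
print(graph_embedding.shape)  # (2,)
```

A graph Transformer layer would replace the fixed distance-based weights with learned attention scores over all node pairs; the aggregation-then-readout structure stays the same.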