Weight-Specific-Decoder Attention Model to Solve Multiobjective Combinatorial Optimization Problems
| Published in: | Conference proceedings - IEEE International Conference on Systems, Man, and Cybernetics, pp. 2839-2844 |
|---|---|
| Main authors: | , , , |
| Format: | Conference paper |
| Language: | English |
| Publisher: | IEEE, 09.10.2022 |
| ISSN: | 2577-1655 |
| Abstract: | Multiobjective combinatorial optimization problems (MOCOPs) have a wide range of real-world applications, so designing effective algorithms for them is of practical significance. Because of the huge search space and limited computation time, traditional exact and heuristic algorithms generally struggle to obtain optimal solutions to such problems. Recently, learning-based algorithms have achieved good results on MOCOPs, but the quality and diversity of the solutions they find can still be improved. In this paper, we propose a Weight-Specific-Decoder Attention Model (WSDAM) to better approximate the whole Pareto set. It embeds a weight-adaptive layer into the decoder so that the decoder attends to the information carried by different weight vectors. During training, weight vectors are sampled from a Dirichlet distribution, which further strengthens the learning of boundary solutions. We evaluate our method on two classic MOCOPs, the multiobjective traveling salesman problem (MOTSP) and the multiobjective capacitated vehicle routing problem (MOCVRP). Experimental results show that the proposed method outperforms current state-of-the-art learning-based methods in both solution quality and generalization ability. |
|---|---|
| DOI: | 10.1109/SMC53654.2022.9945568 |
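This record contains only the abstract, so the following is a minimal sketch rather than the authors' implementation. It illustrates the two mechanisms the abstract names: sampling objective-weight vectors from a Dirichlet distribution during training, and a weight-adaptive layer that conditions a decoder embedding on the weight vector. The class name, the scale-and-shift modulation, and the Dirichlet concentration parameters are all assumptions for illustration.

```python
# Illustrative sketch only: the exact WSDAM architecture is not given in this
# record. WeightAdaptiveLayer below is a hypothetical example of conditioning a
# decoder state on an objective-weight vector.
import torch
import torch.nn as nn


class WeightAdaptiveLayer(nn.Module):
    """Hypothetical weight-adaptive layer: modulates a decoder embedding with a
    scale and shift produced from the objective-weight vector."""

    def __init__(self, num_objectives: int, embed_dim: int):
        super().__init__()
        self.to_scale_shift = nn.Linear(num_objectives, 2 * embed_dim)

    def forward(self, decoder_state: torch.Tensor, weights: torch.Tensor) -> torch.Tensor:
        # decoder_state: (batch, embed_dim); weights: (batch, num_objectives)
        scale, shift = self.to_scale_shift(weights).chunk(2, dim=-1)
        return decoder_state * (1.0 + scale) + shift


# During training, one weight vector per instance is drawn from a Dirichlet
# distribution over the simplex (unit concentration is an assumption here).
num_objectives, embed_dim, batch = 2, 128, 4
dirichlet = torch.distributions.Dirichlet(torch.ones(num_objectives))
weights = dirichlet.sample((batch,))            # each row sums to 1
layer = WeightAdaptiveLayer(num_objectives, embed_dim)
state = torch.randn(batch, embed_dim)           # stand-in for a decoder embedding
conditioned = layer(state, weights)             # weight-conditioned decoder state
print(conditioned.shape)                        # torch.Size([4, 128])
```

In a decomposition-based setting like the one the abstract describes, each sampled weight vector defines one scalarized subproblem, so conditioning the decoder on the weights lets a single model produce solutions across the Pareto front instead of training one model per subproblem.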