Dynamic Stochastic Electric Vehicle Routing with Safe Reinforcement Learning

Bibliographic Details
Title: Dynamic Stochastic Electric Vehicle Routing with Safe Reinforcement Learning
Authors: Basso, Rafael, 1979, Kulcsár, Balázs Adam, 1975, Sanchez-Diaz, Ivan, 1984, Qu, Xiaobo, 1983
Source: EL FORT - Optimering av elfordonsflotta i Real-Tid (Fas 2) [EL FORT - Electric Vehicle Fleet Optimization in Real-Time (Phase 2)]; Transportation Research Part E: Logistics and Transportation Review, vol. 157
Subject Terms: Reinforcement Learning, Approximate Dynamic Programming, Energy Consumption, Electric Vehicles, Vehicle Routing, Green Logistics
Description: Dynamic routing of electric commercial vehicles is a challenging problem since, besides uncertain energy consumption, there are also random customer requests. This paper introduces the Dynamic Stochastic Electric Vehicle Routing Problem (DS-EVRP). A Safe Reinforcement Learning method is proposed for solving the problem. The objective is to minimize expected energy consumption in a safe way, which means also minimizing the risk of battery depletion while en route by planning charging whenever necessary. The key idea is to learn offline about the stochastic customer requests and energy consumption using Monte Carlo simulations, in order to plan the route predictively and safely online. The method is evaluated using simulations based on energy consumption data from a realistic traffic model for the city of Luxembourg and a high-fidelity vehicle model. The results indicate that it is possible to save energy while maintaining reliability by planning routes and charging in an anticipative way. The proposed method has the potential to improve transport operations with electric commercial vehicles, capitalizing on their environmental benefits.
File Description: electronic
Access URL: https://research.chalmers.se/publication/527406
https://research.chalmers.se/publication/526222
https://research.chalmers.se/publication/527406/file/527406_Fulltext.pdf
Database: SwePub
ISSN: 1366-5545
DOI: 10.1016/j.tre.2021.102496
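The key idea described in the abstract (learn expected costs offline from Monte Carlo simulations, penalizing battery depletion, then plan charging stops online) can be illustrated with a toy sketch. This is not the paper's actual algorithm; all function names, energy figures, and the penalty value below are illustrative assumptions.

```python
import random

# Toy illustration (NOT the paper's method): Monte Carlo rollouts
# estimate a plan's expected energy cost, with a large penalty when the
# battery is depleted en route. The online decision then prefers the
# plan with the lower learned expected cost, which inserts a charging
# detour whenever the depletion risk is high.

def rollout(plan, battery_kwh, capacity_kwh, rng, penalty=100.0):
    """Simulate one traversal of `plan`; return its total cost (kWh-equivalent)."""
    cost = 0.0
    for action, value in plan:
        if action == "charge":
            battery_kwh = capacity_kwh  # full recharge at a station
            cost += value               # energy cost of the detour
        else:                           # ("drive", distance_km)
            used = value * rng.uniform(0.15, 0.35)  # stochastic kWh/km
            battery_kwh -= used
            cost += used
            if battery_kwh < 0:         # safety violation: stranded
                return cost + penalty
    return cost

def mc_value(plan, battery_kwh=20.0, capacity_kwh=20.0, n=5000, seed=0):
    """Offline Monte Carlo estimate of a plan's expected cost."""
    rng = random.Random(seed)
    return sum(rollout(plan, battery_kwh, capacity_kwh, rng)
               for _ in range(n)) / n

risky = [("drive", 40.0), ("drive", 35.0)]                  # no charging stop
safe = [("drive", 40.0), ("charge", 1.0), ("drive", 35.0)]  # detour to charge

v_risky = mc_value(risky)
v_safe = mc_value(safe)
# The safe plan pays a small detour cost but avoids the depletion
# penalty, so its learned expected cost is lower.
```

In this toy setting, the risky plan's expected energy use is close to the battery capacity, so depletions occur in a sizable fraction of rollouts and the penalty dominates; the anticipative charging stop makes the safe plan cheaper in expectation despite its detour.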