Approximate dynamic programming based approach to process control and scheduling

Bibliographic Details
Published in: Computers & Chemical Engineering, Volume 30, Issue 10, pp. 1603-1618
Main authors: Lee, Jay H.; Lee, Jong Min
Format: Journal Article
Language: English
Published: Elsevier Ltd, 12.09.2006
Subjects:
ISSN: 0098-1354, 1873-4375
Description
Summary: Multi-stage decision problems under uncertainty are abundant in process industries. The Markov decision process (MDP) is a general mathematical formulation of such problems. Whereas stochastic programming and dynamic programming are the standard methods to solve MDPs, their unwieldy computational requirements limit their usefulness in real applications. Approximate dynamic programming (ADP) combines simulation and function approximation to alleviate the ‘curse-of-dimensionality’ associated with the traditional dynamic programming approach. In this paper, we present ADP as a viable way to solve MDPs for process control and scheduling problems. We bring forth some key issues for its successful application in these types of problems, including the choice of function approximator and the use of a penalty function to guard against over-extending the value function approximation in the value iteration. Application studies involving a number of well-known control and scheduling problems, including dual control, multiple controller scheduling, and resource constrained project scheduling problems, point to the promising potential of ADP.
DOI: 10.1016/j.compchemeng.2006.05.043
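
The recipe the summary describes, a value iteration carried out over simulated states with a function approximator standing in for the exact value function, plus a penalty that keeps the iteration from trusting the approximation far from the sampled data, can be sketched in a few lines. The fragment below is an illustrative, hypothetical example only: the toy MDP, the k-nearest-neighbor approximator, and the distance-based penalty form are assumptions made for demonstration, not the specific algorithm developed in the paper.

```python
# Minimal sketch of simulation-based approximate value iteration.
# All model and tuning choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative 1-D stochastic MDP (assumption): keep the state near zero ---
ACTIONS = np.array([-1.0, 0.0, 1.0])
GAMMA = 0.9

def step(s, a):
    """Simulate one transition; returns (next_state, stage_cost)."""
    s_next = 0.8 * s + a + 0.1 * rng.standard_normal()
    cost = s**2 + 0.1 * a**2
    return s_next, cost

# --- States sampled by simulation (the 'relevant' region of the state space) ---
S = rng.uniform(-3.0, 3.0, size=200)   # sampled states
V = np.zeros_like(S)                   # value estimates at the sampled states

def approx_value(s):
    """k-nearest-neighbor value approximation plus an extrapolation penalty:
    states far from any sampled state receive a pessimistic value, so the
    value iteration does not over-extend the approximation."""
    d = np.abs(S - s)
    idx = np.argsort(d)[:5]
    penalty = 10.0 * max(0.0, d[idx].mean() - 0.5)   # assumed penalty form
    return V[idx].mean() + penalty

# --- Approximate value iteration over the sampled states only ---
for _ in range(50):
    V_new = np.empty_like(V)
    for i, s in enumerate(S):
        # Bellman backup with a small Monte Carlo average over simulated successors
        q = []
        for a in ACTIONS:
            samples = [step(s, a) for _ in range(5)]
            q.append(np.mean([c + GAMMA * approx_value(sn) for sn, c in samples]))
        V_new[i] = min(q)              # cost minimization
    V = V_new

print("approximate value near the origin:", approx_value(0.0))
```

The two hedged design choices mirror the issues the summary raises: the approximator (here a simple nearest-neighbor average) is evaluated only where simulated states exist, and the penalty term discourages the Bellman backup from exploiting value estimates in unexplored regions of the state space.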