Decentralized Markov Decision Processes with Event-Driven Interactions

Detailed bibliography
Published in: Autonomous Agents and Multiagent Systems: Proceedings of the 3rd International Joint Conference, New York City, New York, 2004, pp. 302-309
Main authors: Becker, Raphen; Zilberstein, Shlomo; Lesser, Victor
Format: Conference paper
Language: English
Published: Washington, DC, USA: IEEE Computer Society, 19 July 2004
Series: ACM Conferences
ISBN: 9781581138641, 1581138644
Description
Summary: Decentralized MDPs provide a powerful formal framework for planning in multi-agent systems, but the complexity of the model limits its usefulness. In this paper we study a class of DEC-MDPs that restricts the interactions between the agents to a structured, event-driven dependency. These dependencies can model locking a shared resource or temporal enabling constraints, both of which arise frequently in practice. The complexity of this class of problems is shown to be no harder than exponential in the number of states and doubly exponential in the number of dependencies. Since the number of dependencies is much smaller than the number of states for many problems, this is significantly better than the doubly exponential (in the state space) complexity of DEC-MDPs. We also demonstrate how an algorithm we previously developed can be used to solve problems in this class both optimally and approximately. Experimental work indicates that this solution technique is significantly faster than a naive policy search approach.
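
As an informal reading of the complexity claim in the summary (the symbols |S| for the size of the joint state space and d for the number of event-driven dependencies are assumed here for illustration, not taken from the paper), the two bounds can be contrasted as:

\[
  \underbrace{O\!\left(2^{\,2^{|S|}}\right)}_{\text{general DEC-MDP policy search}}
  \qquad \text{versus} \qquad
  \underbrace{O\!\left(2^{|S|} \cdot 2^{\,2^{d}}\right)}_{\text{event-driven dependency class}}
\]

Since d is typically much smaller than |S|, the second bound is substantially smaller, which is the improvement the summary refers to.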
DOI: 10.5555/1018409.1018761