Abstract
This paper introduces a new class of multi-agent discrete-time dynamical games known as dynamic graphical games, where the interactions between agents are prescribed by a communication graph structure. The graphical game results from multi-agent dynamical systems in which pinning control is used to make all the agents synchronize to the state of a command generator or target agent. The relation between dynamic graphical games and standard multi-player games is shown. A new notion of Interactive Nash equilibrium is introduced, which holds if all the agents are in Nash equilibrium and the graph is strongly connected. The paper brings together discrete Hamiltonian mechanics, distributed multi-agent control, optimal control theory, and game theory to formulate and solve these multi-agent graphical games. The relationship between the discrete-time Hamilton-Jacobi equation and the discrete-time Bellman equation is used to formulate a discrete-time Hamilton-Jacobi-Bellman equation for dynamic graphical games. Proofs of Nash equilibrium, stability, and convergence are given. A reinforcement learning value iteration algorithm is presented to solve the dynamic graphical games.
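To make the synchronization setup described in the abstract concrete, the following is a minimal sketch, assuming discrete-time single-integrator agent dynamics, an illustrative four-node directed graph, and hand-picked pinning and coupling gains. It shows only the pinning control and local neighborhood tracking error structure on which the graphical game is built; it is not the paper's value iteration algorithm.

```python
# Minimal illustrative sketch (not the paper's algorithm): pinning-based
# synchronization of discrete-time single-integrator agents to a command
# generator over a directed communication graph. The graph, pinning gains,
# and coupling gain below are assumptions chosen for this example.
import numpy as np

N = 4                                    # number of follower agents
E = np.array([[0, 1, 0, 0],              # adjacency: E[i, j] = 1 if agent i receives info from agent j
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
g = np.array([1.0, 0.0, 0.0, 0.0])       # pinning gains: only agent 0 observes the target directly
c = 0.3                                  # coupling/control gain (assumed)

rng = np.random.default_rng(0)
x = rng.standard_normal(N)               # follower states
x0 = 1.0                                 # command generator (target) state, taken constant here

for k in range(300):
    # Local neighborhood tracking error:
    #   delta_i = sum_j E[i, j] * (x_i - x_j) + g_i * (x_i - x0)
    delta = (E * (x[:, None] - x[None, :])).sum(axis=1) + g * (x - x0)
    u = -c * delta                       # distributed feedback on the local error
    x = x + u                            # discrete-time update x_i(k+1) = x_i(k) + u_i(k)

print("final states:", np.round(x, 3), "target:", x0)
```

In the dynamic graphical game of the paper, each agent instead selects its control to optimize a local performance index coupled to its graph neighbors, and the resulting graphical-game Bellman and Hamilton-Jacobi-Bellman equations are solved by the reinforcement learning value iteration algorithm; the fixed-gain feedback above merely illustrates the underlying synchronization structure.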
Original language | English |
---|---|
Title of host publication | 2013 ACC conference, 17-19 June 2013, Washington D.C. |
Place of Publication | Piscataway |
Publisher | Institute of Electrical and Electronics Engineers |
Pages | 4189-4195 |
ISBN (Print) | 978-1-4799-0177-7 |
DOIs | |
Publication status | Published - 2013 |
Event | 2013 American Control Conference (ACC 2013), June 17-19, 2013, Washington, DC, USA |
Venue | Renaissance Washington, DC Downtown Hotel, Washington, DC, United States |
Duration | 17 Jun 2013 → 19 Jun 2013 |
Conference
Conference | 2013 American Control Conference (ACC 2013), June 17-19, 2013, Washington, DC, USA |
---|---|
Abbreviated title | ACC 2013 |
Country/Territory | United States |
City | Washington, DC |
Period | 17/06/13 → 19/06/13 |
Internet address | http://acc2013.a2c2.org/ |